Automation Nation

Ethan Mollick, Associate Professor at the Wharton School and author of Co-Intelligence, recently published a post on the advances of AI over the last 21 months.

Mollick is good on the creative possibilities of AI, and writes about them with a sense of realism, in a ‘this is here, how are you going to make the most of it’ kind of way.

Mollick on experts: ‘a big change happens when they stop feeling relieved or smug when AI can’t do something and instead try to figure out how to make the AI succeed.’

It’s a reasonable position - how I learned to stop worrying and love the bomb - but something remains awkward about the formulation ‘how to make the AI succeed’.

Success - whether monetary or artistic - remains subjective.

In the field of sales, I noticed another ‘thought leader’ declaring that AI would be the ‘death of outbound’.

Sales leaders are looking at ways to literally turn AI into never-off sales or business development reps - the first rung of the sales chain, those who try to get the meetings to kick up to the ‘account execs’ (the bastards).

Enter Clay, a program gaining traction amongst LinkedIn nerds. Clay promises to do the personalisation bit of outreach for you, pulling data from an endless set of fields - LinkedIn profiles, podcasts - and crafting the ‘highly personalised’ messages that sales reps might once have sweated over.

Which seems to point to a fork in the road.

On one side, there’s a drive to automate, slash costs, and increase existing activity to levels never seen before.

On the other, there’s a sense of using AI to do the thoughtful work that humans once did, with people becoming the ‘orchestrators’ of said automation - augmentation, not imitation.

The first seems to be the kind of activity experts might despair at; the second, an instance of stopping the despair and figuring out how to make the AI succeed.

Both have maximalist tendencies - neither seems to question the potential costs of its reach (like bitcoin mining before it, AI leaves a large carbon footprint). It seems reasonable to posit that, whichever side of the coin you land on, the automation of either action or thought will come at some cost.

Perhaps, in the not-too-distant future, we’ll be able to automate environmental regulation too?

After all - it looks like we’re going to need to make these machines succeed for us somehow.
