The problem with manual optimization and what to do instead


This article is contributed by Jeremy Fain, CEO and co-founder of Cognitiv.

Manual optimization, as marketers know all too well, is a hassle. It’s hard, stressful, tedious work that takes forever – and just when you think you’ve finally cracked the code, something changes and your entire strategy has to be reconfigured. But despite the inefficiencies, manual optimization is still the preferred process of most advertisers. Why is that, and what can advertisers do to make their optimization process more efficient?

To understand why marketers choose to optimize manually, we first need to look at the evolution of programmatic. When it first hit the market, programmatic was groundbreaking because it allowed both advertisers and publishers to automate their media buying, and gave advertisers the ability to purchase ad inventory quickly and at lower prices. In theory, these lower prices were made possible by the lack of intermediaries involved in the buying and selling process.

Of course, theory and reality often diverge completely. While programmatic made it easier for advertisers to buy ad placements and publishers to sell them, it also created ample opportunities for fraud — not to mention that publishers and marketers were constantly hit by hidden technical fees. In addition, advertisers struggled to determine where to advertise, leading to brand safety concerns.

Thanks to this turn of events, the self-service platform was born. Now, instead of handing the buying of media over to an algorithm that can adapt in real time, advertisers make those decisions themselves, using tools like Excel spreadsheets and, if they’re advanced, basic decision trees to figure out what to do next. While self-service gives advertisers greater transparency and control over ad placement, it comes with its own set of issues. In some ways, it has made marketers’ jobs even harder by requiring them to be data scientists without any data science experience.
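To make the "basic decision tree" concrete, here is a minimal sketch of the kind of hand-maintained rule set this refers to. Every threshold, field name, and action label below is hypothetical, invented purely for illustration:

```python
# A hand-built "decision tree" of the kind a marketer might maintain in a
# spreadsheet. All thresholds, fields, and actions here are hypothetical.

def manual_bid_decision(placement):
    """Return a bid action for one placement based on fixed, manual rules."""
    if placement["ctr"] < 0.001:          # click-through rate too low
        return "pause"
    if placement["cpa"] > 50.0:           # cost per acquisition over budget
        return "lower_bid"
    if placement["viewability"] > 0.7:    # mostly in-view inventory
        return "raise_bid"
    return "keep_bid"

decisions = [
    manual_bid_decision({"ctr": 0.0005, "cpa": 30.0, "viewability": 0.8}),
    manual_bid_decision({"ctr": 0.004, "cpa": 20.0, "viewability": 0.9}),
]
print(decisions)  # ['pause', 'raise_bid']
```

The brittleness the article describes is visible even in this toy: every rule and threshold is static, so when audience behavior shifts, someone has to notice and rewrite the tree by hand.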

For example, advertisers have to work their way through massive amounts of data if they really want to find the right patterns to target. This type of data analysis takes a long time if done right – and marketers have neither the time nor the resources to do it right. By the time the analysis is complete, the findings are often no longer relevant, or the new tactic works for only a short period. To leverage these insights effectively, marketers need to be able to act on them in real time, which is virtually impossible with manual work.
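A stripped-down sketch of that offline analysis, using made-up segment names and a tiny stand-in for a day's impression logs, shows why the insight goes stale: the ranking describes yesterday's batch, not the auction happening now.

```python
from collections import defaultdict

# Hypothetical batch analysis: rank audience segments by conversion rate.
# In practice this runs over millions of log rows, long after the fact.
impressions = [  # (segment, converted) -- toy stand-in for a day's logs
    ("sports", True), ("sports", False), ("news", False),
    ("news", False), ("finance", True), ("finance", True),
]

totals = defaultdict(lambda: [0, 0])  # segment -> [conversions, count]
for segment, converted in impressions:
    totals[segment][0] += int(converted)
    totals[segment][1] += 1

ranked = sorted(
    ((conv / count, seg) for seg, (conv, count) in totals.items()),
    reverse=True,
)
print(ranked[0][1])  # top segment in this sample: 'finance'
```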

Another problem with manual optimization is that it is inconsistent. You may come across an effective strategy one day, only to try it again the next day and find that your results are completely different. This is generally because the patterns found are due to proxy data that hides the true reason the ads work. Manual optimization is a constant, inefficient game of trial and error and prevents advertisers from scaling effectively. In other words, manual optimization forces advertisers to work in the short term, leaving little room to map out a long-term strategy.

What advertisers need is a balance between the first iteration of programmatic – programmatic 1.0 – and the mess that has become self-service. In other words, they need to bring back some degree of automation to make their lives easier and improve results, but without sacrificing transparency.

Hence the emergence of a new way of buying: the automated custom algorithm. There are a few companies doing this now, but to really take advantage of data and find the scale needed for long-term success, advertisers need the power of deep learning. Deep learning algorithms can make decisions in real time about which placements to buy and which to avoid. This approach has several advantages: deep learning can perform data analysis and make predictions at a far more sophisticated level than humans can, and – like humans – it can learn from its mistakes and avoid repeating them.
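The key difference from the manual approach is online learning: the model updates itself after each observed outcome instead of waiting for a batch report. The following is an illustrative toy, not the article's actual system: a single logistic scorer trained with SGD on simulated impressions, where real deep learning bidders would use far larger networks and feature sets.

```python
import math
import random

random.seed(0)
weights = [0.0, 0.0]
bias = 0.0
lr = 0.1  # learning rate

def score(features):
    """Predicted conversion probability for one impression."""
    z = bias + sum(w * x for w, x in zip(weights, features))
    return 1.0 / (1.0 + math.exp(-z))

def update(features, converted):
    """One SGD step on log loss, run as soon as the outcome is observed."""
    global bias
    err = score(features) - converted  # gradient of log loss w.r.t. z
    for i, x in enumerate(features):
        weights[i] -= lr * err * x
    bias -= lr * err

# Simulated impression stream: conversions correlate with the first feature.
for _ in range(2000):
    x = [random.random(), random.random()]
    update(x, 1 if x[0] > 0.5 else 0)

# After learning, high-signal impressions score above low-signal ones,
# so the buyer can bid on the former and skip the latter in real time.
print(score([0.9, 0.5]) > score([0.1, 0.5]))  # True
```

Because each update is a constant-time arithmetic step, the model keeps adapting as behavior shifts, which is exactly what a static spreadsheet of rules cannot do.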

Deep learning has changed the way we interact with technology in recent years: self-driving cars, Instagram effects, augmented reality and talking to your devices are all examples of technology enabled by deep learning. For advertisers, deep learning means they no longer have to worry about delivering performance at scale; as the algorithm learns more about how audience members behave, it improves its predictions, leading to lower costs and higher ROI over time. Finally, deep learning algorithms can optimize buying decisions in real time, giving marketers the time and freedom to effectively oversee all their campaigns and focus on overall strategy rather than day-to-day tactics.

To put it plainly, there is really no longer a need for marketers to do manual optimization. It’s inefficient, expensive and ineffective, and everyone’s time would be much better spent elsewhere. It’s time for us as an industry to embrace the next stage in the evolution of programmatic – one that makes life easier for marketers while still delivering results.

Jeremy Fain is the CEO and co-founder of Cognitiv.

DataDecisionMakers

Welcome to the VentureBeat Community!

DataDecisionMakers is where experts, including the technical people who do data work, can share data-related insights and innovation.

If you want to read about the very latest ideas and up-to-date information, best practices and the future of data and data technology, join us at DataDecisionMakers.

You might even consider contributing an article yourself!

