Category Archives: Media Mentions

AI and Computer Vision Key Trend for 2018

Tom Edwards, Ad Age Marketing Technology Trailblazer and Chief Digital Officer, Agency @ Epsilon, discusses how computer vision, powered by artificial intelligence, will be the key trend to watch in 2018.

I recently provided commentary to CMO.com discussing seven digital trends that will change business forever. My contribution to the piece was tied to the potential impact of computer vision-powered experiences, which led me to create this video.

The video starts with an analysis of all facets of computer vision: how it will impact consumer experience, its role within multimodal computing, how machine learning is accelerating our ability to categorize visual information, and the shift from mobile first to AI first.

It then investigates computer vision's implications and opportunities for marketers: image and object recognition, the camera as a platform, contextual environments and redefining personalized advertising, the alignment of virtual assistants and computer vision, and computer vision's role in the mass adoption of augmented reality.

The video ends with a discussion and research findings tied to the potential impact of computer vision-driven experiences and what they mean for Gen Z, while also exploring the rapidly developing computer vision ecosystem.

Follow Tom Edwards @BlackFin360

Find “BlackFin360” via Alexa

Google I/O 2017 Full Recap

This week I had the opportunity to attend the Google I/O conference in Mountain View, California. It was an incredibly compelling event as Google shifted its focus as a company from mobile first to AI first. This means that all products will be redefined and enhanced through various forms of AI.

This includes the Google Assistant, which was the star of the show. The deck goes into detail, but it's incredibly important that we begin thinking about the role the Google Assistant plays across home, smartphone, wearables, auto, and soon AR. With the launch on the iPhone announced at the conference, the Assistant starts with 200 million voice-enabled devices out of the gate.

What is also key to consider is the Google Assistant equivalent of an Alexa Skill, called an Action by Google. Actions can support transactions outside of Amazon and do not require installation. Only a small number of Actions exist today, but there is a huge and rapidly growing ecosystem of Google Assistant-enabled devices.

Here is the full trend recap and analysis:

Section one covers trends tied to connection & cognition:

  • Vision of Ubiquitous Computing
  • Multi-Modal Computing
  • Google Assistant (Actions, Auto, Computer Vision, Wear)
  • Android O
  • Progressive Web Apps
  • Structured Data & Search

Section two covers all facets of immersive computing:

  • Immersive Computing
  • Daydream (Virtual Reality)
  • Social VR
  • WebVR
  • Visual Positioning Services
  • Tango (Augmented Reality) 
  • WebAR

In addition to the attached recap, there is also a 4-minute "light recap" video:

For third-party commentary, I discussed the role of Google Lens and computer vision with AdExchanger here

Follow Tom Edwards @BlackFin360

In The News: Advertising Age Virtual Reality

I recently provided commentary to Advertising Age discussing the potential impact of Facebook shutting down its VR studio.

Here is my full commentary:

Facebook shut down its VR studio. What kind of message do you think this sends to marketers, brands?

I take it as a good sign that Facebook is divesting from original content and focusing on external creators. Facebook is betting on the democratization of VR versus being the originator of content. This holds true across all of their platforms: they are the enablers of experiences rather than the creators of experiences.

Can you explain to our audience why VR isn’t seeing the explosive growth many were predicting two or three years ago?

The barrier to consuming content through various headsets, and the lack of readily available 360-degree cameras to create immersive content, may be why we are not seeing explosive growth. The key for any new technology, especially one like VR, is to empower the masses to create their own experiences. This is why we see Facebook shifting toward the camera as the first augmented reality platform, as it's built on behaviors consumers already engage in.

What do you think we’ll see next from VR?

Democratization is the key to truly unlocking the potential of VR. Once 360-degree cameras are integrated into phones or become more readily available, we will see acceleration in the creation of VR content. Combine this with the rise of experiences that drive connection, such as Facebook's VR-based Spaces.

Is there an area where you feel VR will see growth in the near future?

I see more opportunity to redefine how we engage with on demand entertainment and sporting events. Having the ability to control and enhance live sports through contextual hotspots, allowing the consumer to control camera angles, as well as enhanced data to support the experience, such as stats and co-viewing with friends, could be a key growth area for VR.

Anything else that you would like to share?

AR will play a key role in the near future. AR will impact our everyday lives and enhance our environments, while VR will shift toward more immersive entertainment and connection with friends and family.

Follow Tom Edwards @BlackFin360

In The News: AdExchanger & F8

I had an opportunity to sit down with AdExchanger during Facebook’s F8 developer conference.

We discussed how Facebook’s new focus on Augmented Reality through camera effects can impact the future of marketing. 

We covered everything from the creation of effect-based advertising to the intersection of artificial intelligence and the role of data.

Read the full article here.

Follow Tom Edwards @BlackFin360

Advertising Age Marketing Technology Trailblazer

Today Advertising Age announced their 2017 list of top 25 Marketing Technology Trailblazers and I am honored to be included.


Photo by Bradley Taylor, Caprock Studio 

A big thank you to the Epsilon corporate communications team, DGC and Advertising Age judges. I am truly humbled by the inclusion with such a great list of industry innovators.

I am incredibly grateful to my data design, strategy, and innovation teams. From research, planning, data design, digital strategy, digital experience delivery, social, and innovation, a huge thank you for all that you do.

I also want to thank Richard McDonald and the Epsilon agency leadership team for your continued support. Richard, it was your vision that sold me on joining Epsilon, and it's one of the best career decisions I have made.

Finally, a very special thank you to my amazing wife Cherlyn for supporting all the crazy hours and travel for the past 17 years.

Follow Tom Edwards @BlackFin360

In The News: eMarketer Wearables Forecast

I was recently interviewed by eMarketer about wearables in 2017 and how they are trending for marketers as the firm evaluated its wearables forecast.

The full report is available to eMarketer PRO subscribers.

My summarized commentary is that most of the client demand I have experienced over the past few years has been web and mobile centric.

Over the years I have focused on the intersection of wearables and the data they create, and how that data can refine a more personalized experience. But the reality is that most wearables are simple extensions of a mobile device, and that limits their value to marketers.

Most of the wearable-based programs I have been a part of focused on the data created as well as actionable notifications, but interest has shifted significantly toward conversational experiences such as chatbots and voice-based systems.

Follow @BlackFin360

In The News: Campaign Live SXSW 2017

I was recently asked by Campaign Live about my thoughts, reactions and takeaways from SXSW Interactive 2017.

My commentary focused on the shift toward programming vs. experiences at this year's event.

Additional Context to the Article Commentary:

2017 may be the year that programming, both from an official and third-party standpoint, became the focal point over experiences. In previous years you would see major brand installations from the sponsors featuring a mix of products and technology. Many traditional SXSW powerhouses, such as AT&T, Samsung, and Chevy, were noticeably absent.

This year more experiences also featured content tracks. The feel was less amusement park and more like attending TED talks with live demonstrations thrown in. It was an odd feeling; the best word to describe SXSW Interactive this year was subdued.

SXSW used to be the ideal event to gauge and project consumer behavior-centric tech trends. We saw consumer empowerment and amplification with the launch of Twitter in 2007. We saw the rise of location based engagement with Foursquare in 2009. We saw the rise of live streaming service Meerkat in 2015, and a slew of other disruptive tech over the years. But marketing is quickly shifting from disruptive tech to acceleration through intelligent systems. 

Now it's less about the latest app fad and more about how quickly the combination of data, intelligent systems, and smart environments is going to fundamentally shift how we interact. This is where SXSW is at a crossroads moving forward.

Follow Tom Edwards @BlackFin360

In The News: SXSW Hope vs Reality

I was recently asked by The Drum to write an op-ed about my hope vs. reality heading into SXSW Interactive 2017.

As a digitally progressive marketer, focusing on current solutions while keeping a close watch on the future, I am at a crossroads when it comes to identifying the value I receive from SXSW.

Each year, I have high hopes for the event. I look forward to real discussions about key topics driving digital. I want to be inspired by compelling brand experiences that showcase the latest technology, which may be a precursor to new ways to connect, empower, entertain, or all of the above.

My hopes remain high, but I am afraid of the reality, given my experience as a SXSW attendee over the past few years. Instead of deep, meaningful discussions, the content, especially outside of keynotes, is either too simplified or so generic it lacks any lasting impact. The other issue is that panels are selected for their titles rather than their substance, and more often than not the content is opinion-based rather than research-based.

The reality has been painful at times. I used to think about SXSW as the ideal event to gauge and project consumer behavior-centric tech trends. We saw consumer empowerment and amplification with the launch of Twitter in 2007.

We saw the rise of location based engagement with Foursquare in 2009. We saw the rise of live streaming service Meerkat in 2015, and a slew of other disruptive tech over the years.

But marketing is quickly shifting from disruptive tech to acceleration through intelligent systems. It’s less about the latest app fad, and more about how quickly the combination of data, intelligent systems and smart environments are going to fundamentally shift how we interact.

You can read the rest of the article on The Drum here.

Follow Tom Edwards @BlackFin360

In The News: Ad Age Data Design & Alexa

I was recently interviewed by Ad Age discussing the efforts of my data design team and our work with Amazon and the Alexa Skills Kit.

When I first joined the Epsilon agency team, I wanted to bridge traditional brand planning, strategy, and data science to assess all of our data sources and build recommendations that leverage the right data to assist planning and strategy development, and to generate data-driven insights that support strategy and creative.

Now the agency data design group comprises three core components: 1) mapping the data landscape, 2) storytelling through data, and 3) consulting & training. My goal with this team is to align intelligence from the data, regardless of source, to inform how we communicate and message with consumers as technology and behaviors evolve and, most importantly, to drive performance.

There are three primary areas of focus for the team:

1) Proprietary data sources & methodologies, e.g. leveraging Epsilon's structured data

2) Unstructured data sources & methodologies, e.g. finding previously invisible insights by applying machine learning and artificial intelligence to unstructured category data

3) New data sources & methodologies, e.g. uncovering new types of data sets, which we call affective datasets, and how they will impact and reshape how we connect across the consumer journey

Unstructured and new data sources, combined with Epsilon's proprietary data, accelerated our processing and analysis capabilities to uncover consumer truths, further fueling our agency's strategic storytelling and data-driven creative and leading to an evolution of brand planning.

For the past 12 months, my data design team has focused on aligning emerging artificial intelligence systems and algorithms with our structured data assets.

Data design is the bridge between planning and bleeding-edge tools like cognitive computing, artificial intelligence, and natural language processing. Ad Age highlighted our approach with Amazon: how we leverage machine learning on amazon.com down to the product SKU level to further inform communication and engagement strategy, and our team being one of the early adopters of the Alexa Skills Kit (ASK).

Here is an example of data design concepts in action.

Follow Tom Edwards @BlackFin360

7 Ways AI will Enhance Marketing

For the past 12 months, my Epsilon team and I have focused on multiple facets of artificial intelligence (AI) with data as the primary fuel that powers key insights. We have leveraged machine learning, natural language processing, predictive APIs, and neural networks to uncover consumer truths that previously would have taken weeks or months to uncover.

Having the opportunity to work with comprehensive, boundless proprietary data assets is incredibly exciting. In addition to fueling strategy work, it also drives emotional connections with consumers, bonding them to brands in meaningful ways. It is the future of marketing.

Now past the experimentation phase, I can say confidently that AI will be a key driver of technology growth over the next decade and will significantly impact consumer marketing. Initial predictions show the market for AI-driven products and services will jump from $36 billion in 2020 to $127 billion by 2025.* (*Source: BofA Merrill Lynch Global Research Estimates — 2017 the year ahead: Artificial Intelligence; The rise of the machines.)

Most AI we work with today is categorized as Artificial Narrow Intelligence (ANI). This means that the AI is extremely adept at executing specific tasks.

Right now, there are seven subsets of artificial intelligence, outlined below. Brand marketers can better uncover insights, connect with consumers, and redefine customer experiences using this innovative technology.

Machine learning (ML)

ML uses human-coded computer algorithms based on mathematical models. Probability models then make assumptions and/or predictions about similar data sets.

Currently, machine learning can be leveraged as a service to accelerate sentiment analysis and domain-specific insights. It also serves as a foundational element for identifying consumer behavior based on occasions, perceptions, and attributes, constructing themes and trends from unstructured data that represents the thoughts, behaviors, and preferences of consumers taken directly from their online activities.
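
As a minimal illustration of the sentiment-analysis side of this, the sketch below trains a toy classifier on a handful of hypothetical consumer comments with scikit-learn. The example comments, labels, and feature choices are assumptions for illustration only, not a production pipeline.

```python
# Toy sentiment classifier: TF-IDF features + logistic regression (scikit-learn).
# The comments and labels are hypothetical stand-ins for unstructured consumer data.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

comments = [
    "Love this product, it fits my morning routine perfectly",
    "The checkout process was slow and frustrating",
    "Great value and fast shipping",
    "Packaging arrived damaged, very disappointed",
]
labels = ["positive", "negative", "positive", "negative"]

model = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), LogisticRegression())
model.fit(comments, labels)

# Score a new, unseen comment.
print(model.predict(["Shipping was quick but the app kept crashing"]))
```

In practice the same pattern scales to far larger volumes of comments, and the predicted labels become inputs for constructing the occasion, perception, and attribute themes described above.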

In 2017 and beyond, I expect more third-party providers will offer ML as a cloud service that brands and agencies can leverage to transform products and services into smart objects able to predict needs and preferences.

Machine learning solutions have allowed my team to align our proprietary structured data assets with unstructured data to combine the best of both worlds. This began to accelerate our processing and analysis capabilities to uncover consumer truths within unstructured data to further fuel our agency’s strategic storytelling.

Cognitive computing

Cognitive computing builds on machine learning using large data sets.

The goal is to create automated IT systems that can solve problems without human intervention. Marketing centric cognitive computing solutions can consist of a single, all-encompassing solution, or be comprised of multiple services that build and scale applications over time.

From a marketing application perspective, cognitive computing-based solutions range from customer experience-enhancing chatbots to closed-loop systems for tracking media performance.
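
As a rough sketch of the chatbot end of that range, the snippet below matches an incoming message against a small set of hand-defined intents and returns a canned reply. The intents and responses are hypothetical, and a production cognitive system would replace the keyword matching with learned models.

```python
# Toy customer-service bot: keyword-based intent matching with canned replies.
# Intents and responses are hypothetical examples.
INTENTS = {
    "order_status": (["order", "tracking", "shipped"], "Your order is on the way."),
    "returns": (["return", "refund", "exchange"], "You can start a return from your account page."),
    "hours": (["hours", "open", "close"], "Our stores are open 9am to 9pm."),
}

def reply(message: str) -> str:
    words = message.lower().split()
    for keywords, response in INTENTS.values():
        if any(keyword in words for keyword in keywords):
            return response
    return "Let me connect you with a human agent."

print(reply("When will my order be shipped?"))
```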

Bank of America recently launched the Erica bot using AI, cognitive messaging, and predictive analytics to further influence consumers’ ability to create better money habits.

Cognitive computing will be key to unlocking the potential of conversational experiences. As these ecosystems continue to grow, many of the 30,000 chatbots on Facebook Messenger are already powered by AI services.

Facebook's own M virtual assistant, housed within Messenger, will soon come out of beta testing and will incorporate cognitive suggestions based on the content of the conversation users are having. The goal is to make Messenger-based interactions more convenient, enabling users to access services without leaving the conversational thread.

Speech recognition and natural language processing (NLP)

NLP refers to intelligent systems able to understand written and spoken language much as humans do, including reasoning and context, and eventually to produce speech and writing of their own. NLP plays an essential role in the creation of conversational experiences.

Voice-based experiences, such as Alexa’s voice services (AVS), will become pervasive over the next few years. It is projected that by 2020, 30 percent of web browsing sessions will happen without a screen.* (*Source: Gartner analysts at Symposium/ITxpo 2016.)

The core of the AVS experience is a combination of automated speech recognition, natural language processing, and a cloud-based AI that together make up a voice-based user experience.

As with most artificial intelligence entities, learning new skills is how personalized and contextual experiences will be created. With Alexa, it is possible to “teach” new conversational elements and interactions through developing skills.

Here is an example from Domino's Pizza that allows consumers to order pizza directly through Alexa voice services.

Alexa skill development is one of the quickest ways for brands to connect with the rapidly growing audience that calls upon Alexa to empower their daily lives.

Fitbit is another brand leveraging Alexa-based skills to extend brand engagement. Traditionally, Fitbit users depend on an app to visualize their data. With the Fitbit Alexa skill, users can get a quick update on the stats that matter most without the need for a screen.
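
To make "teaching" Alexa a skill more concrete, here is a minimal sketch of a skill endpoint using Flask that returns a response in the JSON shape the Alexa Voice Service expects. The intent name, the reply text, and the route are assumptions for illustration; a real skill would also be registered and configured in the Alexa developer console.

```python
# Minimal sketch of an Alexa skill endpoint (Flask).
# The intent name ("GetStepsIntent") and the reply text are hypothetical.
from flask import Flask, request, jsonify

app = Flask(__name__)

@app.route("/alexa", methods=["POST"])
def alexa_webhook():
    payload = request.get_json()
    intent = payload.get("request", {}).get("intent", {}).get("name", "")
    if intent == "GetStepsIntent":
        speech = "You have walked 7,500 steps so far today."
    else:
        speech = "Sorry, I can't help with that yet."
    return jsonify({
        "version": "1.0",
        "response": {
            "outputSpeech": {"type": "PlainText", "text": speech},
            "shouldEndSession": True,
        },
    })

if __name__ == "__main__":
    app.run(port=5000)
```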

Deep learning

Deep learning builds on machine learning using neural networks. Neural networks are statistical models directly inspired by, and partially modeled on, biological neural networks such as the human brain. The use of neural networks is what differentiates deep learning from cognitive computing.

Deep learning is currently redefining Google’s approach to search, and search engine optimization (SEO) will never be the same. Previously, Google search results were based on algorithms defined by a strict set of rules and SEO was based on regression models that looked at past behavior to adjust a given strategy.

With the introduction of RankBrain, Google’s machine learning technology, in 2016, search algorithms are now enhanced with artificial intelligence. Google is now processing roughly 15 percent of daily queries by mixing the core algorithms based on each search type.

The system is adept at analyzing words and phrases that make up a search query. It also decides what additional words and phrases carry similar meaning.
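
A simplified way to picture that "similar meaning" step: represent queries and candidate phrases as vectors and compare them with cosine similarity. The toy vectors below are made up for illustration; systems like RankBrain learn their representations from enormous query logs.

```python
# Toy illustration of vector-based query similarity (cosine similarity with NumPy).
# The 3-dimensional "embeddings" below are invented for illustration only.
import numpy as np

embeddings = {
    "cheap flights to austin": np.array([0.90, 0.10, 0.20]),
    "low cost airfare austin": np.array([0.85, 0.15, 0.25]),
    "austin bbq restaurants":  np.array([0.10, 0.90, 0.30]),
}

def cosine(a, b):
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

query = "cheap flights to austin"
for phrase, vec in embeddings.items():
    if phrase != query:
        print(phrase, round(cosine(embeddings[query], vec), 3))
```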

Expect the percentage of search queries handled by AI to significantly increase. Marketers will need to rethink site architecture, content, and the signals being sent via backlinks as the systems continue to learn on a query-by-query basis.

Predictive application programming interfaces (APIs)

A predictive API uses AI to provide access to predictive models, or exposes the ability to learn and create new models.

Fortune 500 company USAA is analyzing thousands of factors to match broad patterns of customer behavior through its intelligent virtual assistant Nina.

As we shift from consumers using technology to technology enhancing consumers, predictive APIs will play a key role in providing recommendations, enhancing customer service, and providing real-time analytics without in-house data scientists. This is key to unlocking new forms of value exchanges with consumers in a hyperconnected world.
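
In practice, calling a predictive API often looks like a simple HTTP request: send the consumer attributes you have and receive a scored prediction back. The endpoint, credential, payload fields, and response shape below are entirely hypothetical, shown only to illustrate the pattern.

```python
# Hypothetical predictive API call: the URL, key, fields, and response are assumptions.
import requests

API_URL = "https://api.example.com/v1/predict"  # placeholder endpoint
API_KEY = "YOUR_API_KEY"                        # placeholder credential

customer = {"visits_last_30d": 4, "avg_order_value": 56.20, "channel": "mobile"}

resp = requests.post(
    API_URL,
    json=customer,
    headers={"Authorization": f"Bearer {API_KEY}"},
    timeout=10,
)
resp.raise_for_status()
prediction = resp.json()  # e.g. {"churn_risk": 0.12, "next_best_offer": "free_shipping"}
print(prediction)
```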

Image and object recognition

Image recognition finds patterns in visually represented data, pictures, and objects. Facebook and Google are two organizations focused on AI research and solutions in this area.

As image recognition is extended into video and live broadcasts, it will redefine contextual relevance, categorization, and automation of content distribution.

Combined with the advancement of cameras, image recognition and machine learning are transforming the way we process data, including much more than just attitudes and behaviors.

Brand marketers can now leverage images, facial expressions, body gestures, and data collected from IoT-enabled devices to understand the triggers behind behavior and build experiences that anticipate their customers' needs. This requires brand marketers to transform their data strategy, expanding beyond first- and third-party data to incorporate unstructured datasets that capture affect and unconscious data inputs.
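
As a minimal sketch of image recognition in practice, the snippet below runs a pretrained convolutional network from torchvision over a local photo and prints the top predicted classes. The file name is a placeholder, and this generic classifier simply stands in for the purpose-built systems described above.

```python
# Classify a local image with a pretrained ResNet-50 (PyTorch / torchvision).
# "product_photo.jpg" is a placeholder file name.
import torch
from PIL import Image
from torchvision import models, transforms

preprocess = transforms.Compose([
    transforms.Resize(256),
    transforms.CenterCrop(224),
    transforms.ToTensor(),
    transforms.Normalize(mean=[0.485, 0.456, 0.406], std=[0.229, 0.224, 0.225]),
])

model = models.resnet50(pretrained=True)
model.eval()

image = preprocess(Image.open("product_photo.jpg")).unsqueeze(0)
with torch.no_grad():
    probabilities = torch.softmax(model(image)[0], dim=0)

top5 = torch.topk(probabilities, 5)
print(top5.indices.tolist(), top5.values.tolist())  # class indices and confidences
```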

Snap's pending patent on object recognition is potentially game-changing. A recent patent application shows its desire to build object recognition into snaps that can enhance recommended and sponsored filters, most likely powered by an AI-based system. This showcases how any object can be aligned with creating immediate context between a consumer and a brand.

Olay launched an AI-powered Skin Advisor that ingested user-generated photos and provided recommendations for suitable products.

Dynamic experience generation

AI-based systems not only have the ability to parse through large data sets and offer predictive solutions, but can also drive the creation of dynamic experiences. AI will become a powerful tool for creation, not just analysis.

Many startups are leveraging AI APIs to create intelligent solutions. The Grid (https://thegrid.io) uses AI to automate web design with Molly, which analyzes design decisions and creates new web experiences.
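
A very rough sketch of "AI as a creator": generate headline and layout variants, score each combination with a predicted engagement value, and serve the winner. The variants and the scoring function below are hypothetical placeholders; tools like Molly use far richer models, but the generate-score-select loop is the core idea.

```python
# Toy dynamic experience generation: generate variants, score them, pick the best.
# Variants and the scoring heuristic are hypothetical placeholders for a learned model.
import itertools
import random

headlines = ["Meet your new morning routine", "Skincare, personalized by AI", "Glow on your schedule"]
layouts = ["hero-image", "carousel", "split-screen"]

def predicted_engagement(headline: str, layout: str) -> float:
    # Stand-in for a trained model's score; here it is just a deterministic pseudo-random number.
    random.seed(hash((headline, layout)) % (2**32))
    return random.random()

best = max(itertools.product(headlines, layouts), key=lambda pair: predicted_engagement(*pair))
print(f"Serve layout '{best[1]}' with headline: {best[0]}")
```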

Eventually, AI will be a key driver of creating augmented reality experiences. Dynamic experience generation through AI will recreate physics and recognize gestures and movements, generating new consumer experiences.

Below, Mark Zuckerberg discusses the future of AR/VR at Facebook’s F8 conference.

The various subsets of artificial intelligence will continue to be interconnected, redefining how we approach connecting with consumers. AI makes it possible to know the consumer better than ever before. If approached correctly, with the right mix of AI subsets leveraged, companies will see their business grow.

This is a repost of my recent iMedia cover story.

Follow Tom Edwards @BlackFin360