Category Archives: Data

Google I/O 2017 Full Recap

This week I had the opportunity to attend the Google I/O conference in Mountain View, California. It was an incredibly compelling event, as Google shifted its focus as a company from mobile first to AI first. This means that all of its products will be redefined and enhanced through various forms of AI.

This includes the Google Assistant, which was the star of the show. The deck goes into detail, but it’s incredibly important that we begin thinking about the role the Google Assistant plays across home, smartphone, wearables, auto and, soon, AR. With the iPhone launch announced at the conference, the Assistant starts with 200 million voice-enabled devices out of the gate.

What is also key to consider is the Google Assistant equivalent of an Alexa Skill, called an Action by Google. Actions can support transactions outside of Amazon’s ecosystem and do not require installation. Only a small number of Actions exist today, but the ecosystem of Google Assistant-enabled devices is huge and growing rapidly.
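
To make the comparison concrete, here is a minimal sketch, in Python with Flask, of the kind of fulfillment webhook an Action could call when a user speaks to the Assistant. The route, JSON field names and the example intent are illustrative placeholders rather than the exact Actions on Google payload format.

```python
# Minimal sketch of a fulfillment webhook an Assistant Action might call.
# The route, JSON field names ("intent", "parameters", "speech") and the
# "CheckOrderStatus" intent are illustrative placeholders, not the exact
# Actions on Google request/response schema.
from flask import Flask, request, jsonify

app = Flask(__name__)

@app.route("/fulfillment", methods=["POST"])
def fulfillment():
    req = request.get_json(force=True)
    intent = req.get("intent", "")
    params = req.get("parameters", {})

    if intent == "CheckOrderStatus":
        order_id = params.get("order_id", "your latest order")
        speech = f"Here is the status of {order_id}: it shipped this morning."
    else:
        speech = "Sorry, I can't help with that yet."

    # The Assistant speaks the response; no app install is required on the
    # user's device, which is the point of contrast with a native app.
    return jsonify({"speech": speech, "displayText": speech})

if __name__ == "__main__":
    app.run(port=8080)
```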

Here is the full trend recap and analysis:

Section one covers trends tied to connection & cognition:

  • Vision of Ubiquitous Computing
  • Multi-Modal Computing
  • Google Assistant (Actions, Auto, Computer Vision, Wear)
  • Android O
  • Progressive Web Apps
  • Structured Data & Search

Section two covers all facets of immersive computing:

  • Immersive Computing
  • Daydream (Virtual Reality)
  • Social VR
  • WebVR
  • Visual Positioning Services
  • Tango (Augmented Reality) 
  • WebAR

In addition to the attached recap, there is also a 4 minute “light recap” video:

For third-party commentary, I discussed the role of Google Lens & computer vision with AdExchanger here.

Follow Tom Edwards @BlackFin360

Google I/O 2017 Live

Coming to you live from Google’s I/O conference in Mountain View. Here is a recap of some of the key highlights: the shift from mobile first to AI first, the launch of Google Lens, computer vision as the next form of computing, and the digitally augmented future.

Follow Tom Edwards @BlackFin360

Facebook F8 Full Recap & Analysis

I look forward to Facebook’s F8 developer conference each year. It’s a great opportunity to see how Facebook is prioritizing and adjusting its 10-year roadmap based on shifting consumer behavior and new advancements in technology.

What was fascinating about this year’s conference was the rate at which Facebook is accelerating the convergence of technologies that connect us and immerse us in new virtual worlds, advancing innovation well beyond what we would expect from a company that identifies itself as social first.

Facebook wants to redefine how we think about reality and the not too distant future when all reality is augmented and virtual. The following provides analysis across the consumer centric filters of connection, cognition and immersion.

  • Connection – Trends that reimagine how we connect, enable and empower consumers
  • Cognition – Trends where machine based intelligence will disrupt and redefine data assets and how we work
  • Immersion – Trends that align technology and presence to evoke emotion, entertain and power commerce

Here are a few examples of the 15 territories analyzed, starting with:

The Camera as the First Augmented Reality Platform – Facebook understands that in order to truly create scale, the key is to empower consumers, developers and other third parties to create experiences on its behalf. Consumer empowerment is powerful and will accelerate adoption, ultimately nudging consumer behavior toward a new normal.



The democratization of augmented reality (AR), powered by advancing artificial intelligence (AI), has the potential to redefine advertisers’ approaches to content marketing, making it less about content and more about enabling experiences through compelling and contextually relevant effects.

Frames & AR Studio – Two sets of tools comprise the new Camera Effects Platform. Frame Studio allows for the quick creation and deployment of effects that can enhance an image, video or even a Facebook Live stream. It lets artists, creators and brands create frames that can be distributed using Facebook’s targeting capabilities.

AR Studio is where it’s possible to create lightweight AR effects that can be developed and enhanced with elements such as real-time data to build highly contextual AR experiences. This is where brand marketers have an opportunity to align data + experiences.

Gaming & eSports

The convergence of gaming & video has been a massive trend over the past 24 months. 2B people play games each month, and game streams now draw 665M people watching other people play.

On Facebook people watch, play & create. Facebook’s gaming video product supports eSports (14-31% of live gaming consumption), developers, gaming entertainers and social connection for consumers of game stream content. 

Gaming content is digitally native, with real-time interactivity baked in. With gaming video, the audience is more than a spectator; they participate in the experience through comments and by getting involved in the gameplay.

Messenger 2.0 – 2016 was considered the year of the bot, fueled primarily by Facebook’s Messenger beta, which accelerated the development of a bot ecosystem to further enhance the Messenger experience.

In 2017, Facebook is positioning Messenger as Messenger 2.0, with a sharp focus on integrating other services via chat extensions, which give third-party bots the ability to seamlessly connect services such as Spotify or Apple Music.

Facebook is also keen on driving discovery among the 100,000 bots now on the platform via the new discover tab.

Data Design & Artificial Intelligence 

Facebook is focused on leveraging multiple facets of Artificial Intelligence to power their products and accelerate 3rd party ecosystems.

Computer vision, natural language processing, and algorithms drive content discovery and their newly launched AR experiences. AI is now a foundational element to Facebook’s go-to-market strategy.

Facebook’s ultimate goal is to develop intelligent systems that go beyond computer vision and truly understand the world. This will then converge with their vision of an AR driven future to create a unified experience.

The Rise of Proxies – In the very near future we as consumers will have intelligent systems serving the role of a proxy. Facebook is betting on M to first serve as a virtual assistant and eventually become a predictive service that forms the foundation for its virtual computing future.

M will integrate into multiple facets of a user’s life, from sharing location to recommendations. In the near future, M can become the connection between a recommendation and an AR object-recognition action.

Virtual Reality & Facebook Spaces – Facebook officially launched Spaces for Oculus. This was first teased at F8 last year, and the experience has definitely advanced beyond the grainy avatars of a year ago.

Facebook took research and learnings from Oculus Rooms via the Samsung Gear and refined an experience that lets your virtual avatar interact with Facebook content and friends in a virtual environment.

From virtual selfies to watching 360 video, it’s very clear that Facebook is focused on creating a new form of social interaction within a virtual environment.

The Future – Facebook took the first major step in achieving their 10 year goal of fully immersive augmented reality by launching the camera as their first augmented reality platform.

On day 2 of the conference, they outlined in detail their vision of transparent glasses (deemed more socially appropriate), or some equivalent, paired with a general artificial intelligence system to enhance our daily lives.

This includes improving memory, cognition, recognition and redefining how we interact with the physical world and collaborate with one another.

Here is the full recap, consisting of all 15 territories analyzed plus implications for brand marketers to consider based on the trends identified.

Follow Tom Edwards @BlackFin360

F8 2017 Live Recap

Coming to you live from San Jose and Facebook’s F8 conference. Here is a recap of day 1, from the camera as the first AR platform to AI & social VR.


Follow Tom Edwards @BlackFin360

Advertising Age Marketing Technology Trailblazer

Today Advertising Age announced their 2017 list of top 25 Marketing Technology Trailblazers and I am honored to be included.


Photo by Bradley Taylor, Caprock Studio 

A big thank you to the Epsilon corporate communications team, DGC and Advertising Age judges. I am truly humbled by the inclusion with such a great list of industry innovators.

I am incredibly grateful to my data design, strategy and innovation teams. From research, planning and data design to digital strategy, digital experience delivery, social and innovation: a huge thank you for all that you do.

I also want to thank Richard McDonald and the Epsilon agency leadership team for your continued support. Richard, it was your vision that sold me on joining Epsilon, and it’s one of the best career decisions I have made.


Finally, a very special thank you to my amazing wife Cherlyn for supporting all the crazy hours and travel for the past 17 years.

Follow Tom Edwards @BlackFin360

CX Future = Voice + Visual

I have written articles and commented quite a bit about Amazon Alexa and voice based conversational experiences in the media over the past 12 months.

To date there are over 10 million Alexa-powered devices in consumer homes, and that number is about to increase significantly as the Alexa Voice Service is integrated into everything from cars, such as those running the Ford Sync 3 system, to mobile handsets.

Here is an example of Alexa integrated into the Ford Sync 3 system rolling out in various Ford models this fall. 

Regarding Alexa skills: skills are to Alexa what apps are to mobile. When I first met with the Amazon Alexa partner team a year ago, there were barely 1,000 skills published. As of today there are over 10,000, and that number continues to increase.

In addition to skills, the shift toward voice-based experiences has already begun. In 2014, voice search traffic was negligible. Today it exceeds 10% of all search traffic, and virtual assistants handle more than 50B voice searches per month.

That number is going to continue to accelerate: by 2020, it’s projected that over 200 billion searches per month will be done by voice. Voice will quickly become a key horizontal channel and central to a converged user experience.


What most don’t realize, though, is that while most experiences today are zero-UI, voice-only experiences, the next evolution of voice-based systems will be voice + paired visual experiences.

This will ultimately be driven by new hardware that integrates screens, but initially it will be driven by hands-free, responsive web experiences powered by Alexa.

Soon virtual assistants such as the Sony XPERIA Agent shown here at MWC 2017 will have integrated screens to enhance voice + visual.

Voice-based skills will be able to showcase information visually by aligning voice intents with visual cues, creating a seamless, voice-controlled experience.

From dynamic content to video, an Alexa skill can easily answer a query and then showcase complex or highly visual elements on screen, such as what a recipe should actually look like, rather than leaving the user to visualize it in their mind.

Visual cues on the page can also expand what a user can do with Alexa, for example by surfacing related intents such as repeat, help and next steps via the responsive web experience.
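
To illustrate the idea of pairing a voice intent with a visual cue, here is a rough Python sketch of an Alexa-style intent handler that returns the spoken answer while pushing a visual payload to a hypothetical companion responsive web page. The intent name, slot and companion endpoint are assumptions for illustration only; this is not the actual skill referenced below.

```python
# Rough sketch: an Alexa-style intent handler that answers by voice and also
# sends a visual payload to a companion responsive web page. The handler
# shape, the recipe slot and the /visual-update endpoint are hypothetical,
# used only to illustrate voice + visual pairing.
import json
import urllib.request

COMPANION_ENDPOINT = "https://example.com/visual-update"  # assumed companion page API

def push_visual(session_id, payload):
    """Send a visual cue (image, next-step hints) to the paired web page."""
    data = json.dumps({"session_id": session_id, "visual": payload}).encode("utf-8")
    req = urllib.request.Request(
        COMPANION_ENDPOINT, data=data, headers={"Content-Type": "application/json"}
    )
    urllib.request.urlopen(req)

def handle_recipe_intent(event):
    session_id = event["session"]["sessionId"]
    recipe = event["request"]["intent"]["slots"]["recipe"]["value"]

    # Visual side: show what the dish should look like plus related intents
    # (repeat, help, next step) so the user knows how to keep engaging.
    push_visual(session_id, {
        "title": recipe,
        "image": f"https://example.com/images/{recipe}.jpg",
        "hints": ["repeat", "help", "next step"],
    })

    # Voice side: the spoken answer returned to Alexa.
    return {
        "version": "1.0",
        "response": {
            "outputSpeech": {"type": "PlainText",
                             "text": f"Here is how to make {recipe}. Step one..."},
            "shouldEndSession": False,
        },
    }
```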

This is one of the challenges with pure voice experiences: the user doesn’t always know what their options are for further engaging with different aspects of a given skill.

Voice + visual can also enhance long-term engagement, which is currently the biggest barrier for Alexa experiences. By considering visual + voice content, it becomes feasible to extend into more entertainment mediums that can be controlled and enhanced via voice.

Voice + Visual also has an impact on the type of data that can be gleaned from progressive profiling and opens up new ways to deploy existing content assets into a system based/virtual assistant driven journey.

I have seen the future through a first-of-its-kind example of voice (Alexa) + visual (responsive web), and it is mind-blowing. I can’t show it publicly yet, but it will reframe your approach to voice-based strategy.

I will update this post with visuals once the first voice + paired visual skill is published shortly.

Follow Tom Edwards @BlackFin360

In The News: eMarketer Wearables Forecast

I was recently interviewed by eMarketer about wearables in 2017 and how they are trending for marketers, as part of eMarketer’s evaluation of its future forecast for wearables.

The full report is available to eMarketer PRO subscribers.

My summarized commentary is that most of the client demand I have experienced over the past few years has been web and mobile centric.

Over the years I have focused on the intersection of wearables and the data they create, and how that data can drive a more personalized experience. But the reality is that most wearables are simple extensions of a mobile device, which limits their value to marketers.

Most of the wearable-based programs I have been a part of focused on the data created and on actionable notifications, but interest has shifted significantly toward conversational experiences such as chatbots and voice-based systems.


Follow @BlackFin360

MWC 2017 – Data Design Speaking Recap

What a great show! Mobile World Congress is when the tech world converges on Barcelona, Spain, to discuss the ever-expanding domain of mobile. I was excited to attend this year’s event for three reasons: a speaking engagement, conducting tours for media, and live streaming on behalf of Epsilon. This post focuses on a comprehensive recap of my panel discussion and pre-session approach.

SPEAKING – I had the opportunity to speak at the Modern Marketing Summit event at Mobile World Congress with the CMO of Aston Martin. The main topic was discussing where he could place bets on emerging tech in the near future. I wanted to put more rigor around the discussion and spent time ahead of the session diving into our proprietary data assets to uncover hidden truths about Aston Martin drivers as the basis for recommendations on where to invest for the future.


One of the teams I lead is called Data Design. We take unstructured data from a given category, such as automotive, and apply machine learning to process conversation among owners and map key perceptions, occasions, attributes and personality. Machine learning directs our quantitative research, and then we overlay some of the world’s largest proprietary data assets to map category perceptions and behavior among Aston Martin drivers.
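
The proprietary pipeline itself isn’t something I can share, but as a minimal illustration of the underlying technique of surfacing themes from unstructured owner conversation, here is a short Python sketch using TF-IDF and NMF topic modeling from scikit-learn. The sample posts and topic count are placeholders.

```python
# Minimal sketch of surfacing themes (perceptions, occasions, attributes)
# from unstructured owner conversation using TF-IDF + NMF topic modeling.
# The sample posts and topic count are placeholders; this illustrates the
# general technique, not the proprietary Data Design pipeline.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.decomposition import NMF

owner_posts = [
    "Took the car on a weekend coastal drive, the exhaust note is incredible",
    "Track day with the club, handling and braking felt razor sharp",
    "Concours event downtown, everyone asked about the hand-stitched interior",
    "Long commute but the grand touring comfort makes it effortless",
]

vectorizer = TfidfVectorizer(stop_words="english")
tfidf = vectorizer.fit_transform(owner_posts)

nmf = NMF(n_components=2, random_state=42)
nmf.fit(tfidf)

# Print the top terms per discovered topic as a rough map of themes.
terms = vectorizer.get_feature_names_out()
for topic_idx, weights in enumerate(nmf.components_):
    top_terms = [terms[i] for i in weights.argsort()[::-1][:5]]
    print(f"Topic {topic_idx}: {', '.join(top_terms)}")
```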


This approach proved impactful: the data design foundation differentiated the point of view through insights and allowed a more seamless transition into discussing the intersection of emerging technology and the new behavioral signals that will continue to empower consumers.

I then began mapping future-state strategy through the lens of Connection, Cognition & Immersion.


CONNECTION – Trends and technology that connect us; this can include voice-based and conversational experiences such as chatbots. Here are previous posts on Connection.

COGNITION – All facets of artificial intelligence such as Machine Learning, Deep Learning, Neural Networks. Here is a previous post on AI.

IMMERSION – Fully sensory and immersive experiences: virtual, augmented, mixed and merged reality will all have an impact in the near future, possibly shifting entertainment from the back seat to the front. Here are previous posts on Immersion.

Once I had outlined each component of the Connection, Cognition & Immersion framework, I recommended that he begin by laying a foundational, data-designed strategy to prepare for the pending intelligence revolution.

The Intelligence Revolution will incorporate both reactive and predictive elements in anticipation of the rise of the Proxy Web & System based journeys. All of this is built on a foundation of data + decisioning and will transcend individual technologies.

Here is additional context about the four components of the intelligence revolution:

REACTIVE DATA SETS – Today most consumer centric marketing is based on reactive data. For this panel I began with machine learning based AI to map the psychographics of the Aston Martin user.

PREDICTIVE – Next you will see the rise of predictive algorithms and APIs. This is where reactive datasets are combined with regression analysis and modeling to build toward predictive experiences (a minimal sketch follows after these four components).

PROXY WEB – This is the most important point to consider: very soon, the consumer may no longer be at the center of marketing. The proxy web is where bots and other intelligent systems drive predictive discovery via vertical and horizontal algorithms, where bots become the new DSPs and IoT-based sensors and intelligent environments become the new DMPs.

SYSTEM-BASED JOURNEYS – This leads to a new type of consumer journey: system-based journeys that provide predictive elements while also overlaying situational awareness across an intelligent environment.
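
As a minimal, hypothetical illustration of the reactive-to-predictive step described above, here is a short Python sketch that fits a simple model on reactive engagement data to score the likelihood of a future action. The features and data are invented purely for illustration.

```python
# Hypothetical sketch of moving from reactive data to a predictive score:
# fit a logistic regression on past engagement features to estimate the
# likelihood of a future action (e.g., booking a test drive). The features
# and data are invented for illustration only.
import numpy as np
from sklearn.linear_model import LogisticRegression

# Reactive features per consumer: [site visits, voice/chat interactions, event attendances]
X = np.array([
    [2, 0, 0],
    [5, 1, 0],
    [8, 3, 1],
    [1, 0, 0],
    [7, 2, 1],
    [3, 1, 0],
])
# Whether each consumer later took the target action
y = np.array([0, 0, 1, 0, 1, 0])

model = LogisticRegression().fit(X, y)

# Score a new consumer's likelihood of taking the action.
new_consumer = np.array([[6, 2, 1]])
likelihood = model.predict_proba(new_consumer)[0, 1]
print(f"Predicted likelihood of taking the action: {likelihood:.2f}")
```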

More detail to come on the topic of the Intelligence revolution in a future post.

Follow Tom Edwards @BlackFin360

LIVE: MWC 2017 Trend Recap

Here is a video recap shot live from the floor of Mobile World Congress 2017 in Barcelona.

The video outlines emerging technology and trends tied to Connection, Cognition and Immersion and touches on key territories such as:

  • Evolution of Conversational Experiences
  • Artificial Intelligence and Advancements in Smart Assistants
  • New Types of Interfaces Beyond Mobile
  • The rise of 5G
  • Convergence of Artificial Intelligence and Virtual Reality

Follow Tom Edwards @BlackFin360