Advertising Age Marketing Technology Trailblazer

Today Advertising Age announced their 2017 list of top 25 Marketing Technology Trailblazers and I am honored to be included.


Photo by Bradley Taylor, Caprock Studio 

A big thank you to the Epsilon corporate communications team, DGC and the Advertising Age judges. I am truly humbled to be included alongside such a great list of industry innovators.

I am incredibly grateful to my data design, strategy and innovation teams. From research, planning, data design, digital strategy, digital experience delivery, social and innovation: a huge thank you for all that you do.


Finally, a very special thank you to my amazing wife Cherlyn for supporting all the crazy hours and travel for the past 17 years.

Follow Tom Edwards @BlackFin360

CX Future = Voice + Visual

Over the past 12 months I have written articles and commented quite a bit in the media about Amazon Alexa and voice-based conversational experiences.

To date there are over 10 million Alexa-powered devices in consumer homes, and that number is about to increase significantly as Alexa Voice Services integrates into everything from cars, such as the Ford Sync 3 system, to mobile handsets.

Here is an example of Alexa integrated into the Ford Sync 3 system rolling out in various Ford models this fall. 

Skills are to Alexa what apps are to mobile. When I first met with the Amazon Alexa partner team a year ago there were barely 1,000 skills published; as of today there are over 10,000, and that number continues to climb.

Beyond skills, the shift towards voice-based experiences has already begun. In 2014, voice search traffic was negligible. Today it exceeds 10% of all search traffic, and virtual assistants handle more than 50 billion voice searches per month.

That number will continue to accelerate: by 2020, more than 200 billion searches per month are projected to be done with voice. Voice will quickly become a key horizontal channel and central to a converged user experience.


What most don’t realize is that while most experiences today are zero-UI, voice-only experiences, the next evolution of voice-based systems will be voice + paired visual experiences.

This will ultimately be driven by new hardware that integrates screens, but initially it will be driven by responsive web experiences that are powered by Alexa and remain hands free.

Soon virtual assistants such as the Sony XPERIA Agent shown here at MWC 2017 will have integrated screens to enhance voice + visual.

Voice-based skills will be able to showcase information visually by aligning voice intents with visual cues to create a seamless, voice-controlled experience.

From dynamic content to video, an Alexa skill can answer a query and visually showcase complex or highly visual elements, such as what a recipe should actually look like, rather than forcing the user to picture it in their mind.

Visual cues on the page can also expand what a user can do with Alexa, such as surfacing related intents like repeat, help and next steps via a responsive web experience.
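To make the intent-to-visual pairing concrete, here is a minimal, purely hypothetical sketch in Python: a mapping from Alexa intent names to visual states that a companion responsive web page could render alongside the spoken response. The intent and view names are illustrative only, not a published skill.

# Hypothetical sketch: pairing Alexa intents with visual states on a
# companion responsive web page. Intent and state names are illustrative only.

INTENT_TO_VISUAL = {
    "GetRecipeIntent":   {"view": "recipe_card", "media": "hero_image"},
    "NextStepIntent":    {"view": "step_detail", "media": "step_video"},
    "AMAZON.HelpIntent": {"view": "help_overlay", "hints": ["repeat", "next", "stop"]},
}

def handle_intent(intent_name: str) -> dict:
    """Return the spoken response plus the visual payload a web client could render."""
    visual = INTENT_TO_VISUAL.get(intent_name, {"view": "default"})
    speech = f"Okay, showing {visual['view'].replace('_', ' ')} on your screen."
    return {"speech": speech, "visual": visual}

if __name__ == "__main__":
    print(handle_intent("GetRecipeIntent"))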

This is one of the challenges with pure voice experiences: the user doesn’t always know what their options are to further engage with different aspects of a given skill.

Voice + Visual can also enhance long-term engagement, which is currently the biggest barrier for Alexa experiences. By considering visual + voice content, it is feasible to extend into more entertainment mediums that can be controlled and enhanced via voice.

Voice + Visual also has an impact on the type of data that can be gleaned from progressive profiling and opens up new ways to deploy existing content assets into a system based/virtual assistant driven journey.

I have literally seen the future through a first-of-its-kind example of voice (Alexa) + visual (responsive web), and it is mind blowing. I can’t show it publicly yet, but it will reframe your approach to voice-based strategy.

I will update this post with visuals once the first voice + paired visual experience skill is published shortly.

Follow Tom Edwards @BlackFin360

SXSW Interactive 2017 Trend Recap

This past week, over 30,000 digitally centric professionals, including myself and Ian Beacraft, descended on Austin, Texas for SXSW Interactive 2017. Our focus was to meet with key strategic partners, gauge emerging trends, monitor product launches and most importantly create content and POVs.

Content included a comprehensive text based trend recap download, live streaming from the trade show floor as well as a full video recap.

Over the years, SXSW has been an ideal event to gauge and project consumer-centric tech trends: Twitter empowering consumers in 2007, Foursquare focusing on location in 2009, social proximity with Highlight in 2012 and live streaming via Meerkat in 2015.

2017 focused on the rise of intelligent systems from a content perspective and on immersive experiences that bridge physical and digital.

Marketing is quickly shifting from disruptive tech to acceleration through intelligent systems. It’s less about the latest app fad, and more about how quickly the combination of data, intelligent systems and smart environments are going to impact consumer behavior in the future.

The technology featured at SXSWi 2017 aligns with my view of the coming intelligence revolution. This revolution will be built on new data types that will simplify complex tasks, predict need states and usher in new forms of computing that will radically alter how we connect with both consumers and intelligent proxies.

The attached event recap highlights trends across our framework of Connection, Cognition, Immersion & Convergence, which builds toward the acceleration of the Intelligence Revolution.

Connection – Trends that reimagine how we connect, enable and empower consumers.

  •  How conversational experiences are evolving and the impact that voice based experiences will have on the web
  • How social proximity and personalization have been refined
  • How interactive video is evolving

Cognition – Trends where machine-based intelligence will disrupt and redefine data assets and how we work.

  • Understand the evolution of storytelling through AI and the importance of data design
  • How emotive robotics will serve as a bridge between general assistants of today to the intelligent and more human systems of tomorrow
  • Learn more about the friction between artificial intelligence and intelligence augmentation of humans
  • Learn about the pending intelligence revolution and the role that the Proxy Web will play

Immersion – Trends that align technology and presence to evoke emotion, entertain and power commerce

  • Understand the evolution of immersive and full sensory experiences. From new forms of user interfaces such as light to mixed reality and everything in-between

Here is the download for the SXSW 2017 Trend Recap and Full Recap Video.

Follow Tom Edwards @BlackFin360

Follow Ian Beacraft @Ianbcraft

Share this post on Linkedin

In The News: Campaign Live SXSW 2017

I was recently asked by Campaign Live about my thoughts, reactions and takeaways from SXSW Interactive 2017.

My commentary focused on the shift towards programming vs. experiences at this year's event.

Additional Context to the Article Commentary:

2017 may be the year that programming, both official and third party, became the focal point rather than experiences. In previous years you would see major brand installations from the sponsors featuring a mix of products and technology. A lot of traditional SXSW powerhouses such as AT&T, Samsung and Chevy were noticeably absent.

This year more experiences also featured content tracks. The feel was less amusement park and more like attending TED talks with live demonstrations thrown in. It was an odd feeling as the best word to describe SXSW Interactive this year was subdued. 

SXSW used to be the ideal event to gauge and project consumer behavior-centric tech trends. We saw consumer empowerment and amplification with the launch of Twitter in 2007. We saw the rise of location based engagement with Foursquare in 2009. We saw the rise of live streaming service Meerkat in 2015, and a slew of other disruptive tech over the years. But marketing is quickly shifting from disruptive tech to acceleration through intelligent systems. 

Now it’s less about the latest app fad, and more about how quickly the combination of data, intelligent systems and smart environments are going to fundamentally shift how we interact. This is where SXSW is at a crossroads moving forward.

Follow Tom Edwards @BlackFin360

LIVE: SXSW Interactive 2017 Recap

Here is a video recap shot live on day 1 of SXSW Interactive 2017 at the opening of the trade show floor.

The video outlines emerging technology and trends tied to Connection, Cognition and Immersion and touches on key territories such as:

  • Conversational Experiences
  • Emotional Intelligence
  • Artificial intelligence vs. Intelligence Augmentation
  • Mixed Reality
  • The rise of the Proxy Web

Follow Tom Edwards @BlackFin360

In The News: SXSW Hope vs Reality

I was recently asked by the Drum to write an op-ed about my hope vs reality heading into SXSW Interactive 2017.

As a digitally progressive marketer, focused on current solutions while keeping a close watch on the future, I am at a crossroads when it comes to identifying the value I receive from SXSW.

Each year, I have high hopes for the event. I look forward to real discussions about key topics driving digital. I want to be inspired by compelling brand experiences that showcase the latest technology, which may be a precursor to new ways to connect, empower, entertain, or all of the above.

My hopes remain high, but I am afraid of the reality, given my experience as a SXSW attendee the past few years. Instead of deep, meaningful discussions, the content, especially outside of keynotes, is either too simplified or so generic it lacks any lasting impact. The other issue is that panels are selected for their titles rather than their substance, and more often than not the content is opinion-based rather than research-based.

The reality has been painful at times. I used to think about SXSW as the ideal event to gauge and project consumer behavior-centric tech trends. We saw consumer empowerment and amplification with the launch of Twitter in 2007.

We saw the rise of location based engagement with Foursquare in 2009. We saw the rise of live streaming service Meerkat in 2015, and a slew of other disruptive tech over the years.

But marketing is quickly shifting from disruptive tech to acceleration through intelligent systems. It’s less about the latest app fad, and more about how quickly the combination of data, intelligent systems and smart environments are going to fundamentally shift how we interact.

You can read the rest of the article on the Drum here.

Follow Tom Edwards @BlackFin360

MWC 2017 – Data Design Speaking Recap

What a great show! Mobile World Congress is when the tech world converges on Barcelona, Spain to discuss the ever-expanding domain of mobile. I was excited to attend this year's event for three reasons: a speaking engagement, conducting tours for media and live streaming on behalf of Epsilon. This post is a comprehensive recap of my panel discussion and pre-session approach.

SPEAKING – I had the opportunity to speak at the Modern Marketing Summit at Mobile World Congress with the CMO of Aston Martin. The main topic was where he could place bets on emerging tech in the near future. I wanted to put more rigor around the discussion, so ahead of the session I dove into our proprietary data assets to uncover hidden truths about Aston Martin drivers as the basis for recommendations on where to invest for the future.


One of the teams I lead is called Data Design. We take unstructured data from a given category, such as automotive, and apply machine learning to process conversation among owners and map key perceptions, occasions, attributes and personality. Machine learning directs our quantitative research, and then we overlay some of the world's largest proprietary data assets to map category perceptions and behavior among Aston Martin drivers.
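As a simplified illustration of this kind of theme mapping (not our proprietary Data Design pipeline), here is a minimal sketch that applies topic modeling to a handful of toy category posts, assuming a recent version of scikit-learn is installed.

# Minimal sketch of theme extraction from unstructured category conversation.
# Illustrative only, not the Data Design pipeline; the posts are toy examples.

from sklearn.feature_extraction.text import CountVectorizer
from sklearn.decomposition import LatentDirichletAllocation

posts = [
    "Weekend drives on the coast are why I bought this car",
    "Track day performance and the exhaust note are unmatched",
    "Craftsmanship of the interior feels like a bespoke suit",
    "Commuting is not the point, this car is for occasions",
]

vectorizer = CountVectorizer(stop_words="english")
doc_term = vectorizer.fit_transform(posts)

# Two topics stand in for perception/occasion themes
lda = LatentDirichletAllocation(n_components=2, random_state=0)
lda.fit(doc_term)

terms = vectorizer.get_feature_names_out()
for idx, topic in enumerate(lda.components_):
    top_terms = [terms[i] for i in topic.argsort()[-4:]]
    print(f"Theme {idx}: {top_terms}")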


This approach proved impactful: the data design foundation enabled a differentiated point of view grounded in insights and allowed a more seamless transition into discussing the intersection of emerging technology and the new behavioral signals that will continue to empower consumers.

I began mapping future-state strategy through the lens of Connection, Cognition & Immersion.


CONNECTION – Trends and technology that connect us; this can include voice-based and conversational experiences such as chatbots. Here are previous posts on Connection.

COGNITION – All facets of artificial intelligence such as Machine Learning, Deep Learning, Neural Networks. Here is a previous post on AI.

IMMERSION – Full sensory and immersive experiences: virtual, augmented, mixed and merged reality will all have an impact in the near future, and may shift entertainment from the back seat to the front. Here are previous Immersion posts.

Once I outlined each component of the Connection, Cognition & Immersion framework, I recommended that he first lay a foundational, data-designed strategy to prepare for the pending intelligence revolution.

The Intelligence Revolution will incorporate both reactive and predictive elements in anticipation of the rise of the Proxy Web & System based journeys. All of this is built on a foundation of data + decisioning and will transcend individual technologies.

Here is additional context about the four components of the intelligence revolution:

REACTIVE DATA SETS – Today most consumer centric marketing is based on reactive data. For this panel I began with machine learning based AI to map the psychographics of the Aston Martin user.

PREDICTIVE – Next you will see the rise of predictive algorithms and APIs. This is where reactive datasets are combined with regression analysis and modeling to build towards predictive experiences.
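A minimal sketch of that reactive-to-predictive step, using toy engagement signals and a simple regression model; this assumes scikit-learn and the features and data are illustrative only, not Epsilon data.

# Illustrative sketch: a simple model over reactive signals to move toward prediction.

import numpy as np
from sklearn.linear_model import LogisticRegression

# Reactive signals: [recent_visits, minutes_on_configurator, dealer_searches]
X = np.array([
    [1, 2, 0],
    [4, 15, 1],
    [6, 30, 2],
    [0, 1, 0],
    [5, 22, 3],
    [2, 5, 0],
])
# 1 = requested a test drive within 30 days (the "need state" to predict)
y = np.array([0, 1, 1, 0, 1, 0])

model = LogisticRegression().fit(X, y)
prospect = np.array([[3, 18, 1]])
print("Probability of test-drive need state:", model.predict_proba(prospect)[0, 1])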

PROXY WEB – This is the most important point to consider: very soon, the consumer may not be at the center of marketing. The Proxy Web is where bots and other intelligent systems will drive predictive discovery via vertical and horizontal algorithms, where bots become the new DSPs and IoT-based sensors and intelligent environments become the new DMPs.

SYSTEM BASED JOURNEYS – This will lead to a new type of consumer journey: system-based journeys that provide predictive elements while also overlaying situational awareness across an intelligent environment.

More detail to come on the topic of the Intelligence revolution in a future post.

Follow Tom Edwards @BlackFin360

LIVE: MWC 2017 Trend Recap

Here is a video recap shot live from the floor of Mobile World Congress 2017 in Barcelona.

The video outlines emerging technology and trends tied to Connection, Cognition and Immersion and touches on key territories such as:

  • Evolution of Conversational Experiences
  • Artificial intelligence and Advancements in Smart Assistants
  • New Types of Interfaces Beyond Mobile
  • The rise of 5G
  • Convergence of Artificial Intelligence and Virtual Reality

Follow Tom Edwards @BlackFin360

In The News: Ad Age Data Design & Alexa

I was recently interviewed by Ad Age discussing the efforts of my data design team and our work with Amazon and the Alexa Skills Kit.


When I first joined the Epsilon agency team, I wanted to bridge traditional brand planning, strategy and data science to assess all of our data sources and build recommendations that leverage the right data to support planning, strategy development and data-driven insights for strategy and creative.

Now the agency data design group comprises three core components: 1) Mapping the data landscape 2) Storytelling through data 3) Consulting & training. My goal with this team is to align intelligence from the data, regardless of source, to inform how we communicate and message with consumers as technology and behaviors evolve, and most importantly to drive performance.

There are three primary areas of focus for the team:

1) Proprietary data sources & methodologies e.g. Leveraging Epsilon’s structured data

2) Unstructured data sources & methodologies e.g. Finding previously invisible insights by applying machine learning & artificial intelligence to unstructured category data

3) New data sources & methodologies e.g. Uncovering new types of data sets that we call affective datasets and how they will impact and reshape how we connect across the consumer journey


Unstructured and new data sources combined with Epsilon’s proprietary data accelerated our processing and analysis capabilities, uncovering consumer truths that further fuel our agency’s strategic storytelling and data-driven creative and lead to an evolution of brand planning.

For the past 12 months my data design team has focused on aligning emerging artificial intelligence systems and algorithms with our structured data assets to combine all of the following elements.


Data Design is the bridge between planning and bleeding edge tools like cognitive computing, artificial intelligence and natural language processing. Ad Age highlighted our approach with Amazon and how we leverage machine learning on amazon.com down to the product SKU level to further inform communication and engagement strategy as well as our team being one of the early adopters of the Alexa Skills Kit (ASK).


Here is an example of data design concepts in action.


Follow Tom Edwards @BlackFin360

7 Ways AI will Enhance Marketing

For the past 12 months, my Epsilon team and I have focused on multiple facets of artificial intelligence (AI) with data as the primary fuel that powers key insights. We have leveraged machine learning, natural language processing, predictive APIs, and neural networks to uncover consumer truths that previously would have taken weeks or months to uncover.

Having the opportunity to work with comprehensive, boundless proprietary data assets is incredibly exciting. In addition to fueling strategy work, it also drives emotional connections with consumers, bonding them to brands in meaningful ways. It is the future of marketing.

Now past the experimentation phase, I can say confidently that AI will be a key driver of technology growth over the next decade and will significantly impact consumer marketing. Initial predictions show the market for AI-driven products and services will jump from $36 billion in 2020 to $127 billion by 2025*. (*Source: BofA Merrill Lynch Global Research Estimates — 2017 the year ahead: artificial intelligence; the rise of the machines.)

Most AI we work with today is categorized as Artificial Narrow Intelligence (ANI). This means that the AI is extremely adept at executing specific tasks.

Right now, there are seven subsets of artificial intelligence, outlined below. Brand marketers can better uncover insights, connect with consumers, and redefine customer experiences using this innovative technology.

Machine learning (ML)

ML uses human coded computer algorithms based on mathematical models. Probability models then make assumptions and/or predictions about similar data sets.

Currently, machine learning can be leveraged as a service to accelerate sentiment analysis and domain-specific insights. It also serves as a foundational element for identifying consumer behavior based on occasions, perceptions, and attributes to construct themes and trends from unstructured data which represents the thoughts, behaviors, and preferences of consumers taken directly from their online activities.
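As a simplified illustration of the kind of sentiment scoring such a service performs, here is a minimal sketch with toy training data; it assumes scikit-learn is installed and is not any particular vendor's offering.

# Hedged sketch: a tiny sentiment classifier of the kind a "machine learning
# as a service" offering might expose. Training data is toy.

from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

reviews = [
    "love this product, works perfectly",
    "terrible experience, would not recommend",
    "great value and fast shipping",
    "broke after one use, very disappointed",
]
labels = ["positive", "negative", "positive", "negative"]

sentiment_model = make_pipeline(TfidfVectorizer(), LogisticRegression())
sentiment_model.fit(reviews, labels)

print(sentiment_model.predict(["really happy with this purchase"]))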

In 2017 and beyond, I expect more third-party providers will offer ML as a cloud service brands and agencies can leverage to transform products and services into smart objects, able to predict needs and preferences.

Machine learning solutions have allowed my team to align our proprietary structured data assets with unstructured data to combine the best of both worlds. This began to accelerate our processing and analysis capabilities to uncover consumer truths within unstructured data to further fuel our agency’s strategic storytelling.

Cognitive computing

Cognitive computing builds on machine learning using large data sets.

The goal is to create automated IT systems that can solve problems without human intervention. Marketing centric cognitive computing solutions can consist of a single, all-encompassing solution, or be comprised of multiple services that build and scale applications over time.

From a marketing application perspective, cognitive computing-based solutions range from customer experience enhancing chatbots to closed loop systems for tracking media performance.

Bank of America recently launched its Erica bot, which uses AI, cognitive messaging, and predictive analytics to help consumers create better money habits.

Cognitive computing will be key to unlocking the potential of conversational experiences. As ecosystems continue to rise, many of the 30,000 chatbots on Facebook Messenger are powered by AI services.

Facebook’s own M virtual assistant, housed within Messenger, will soon come out of beta testing and will incorporate cognitive suggestions based on the content of a conversation users are having. The goal is to make Messenger-based interactions more convenient, enabling users to access services without leaving the conversational thread within Messenger.

Speech recognition and natural language processing (NLP)

NLP refers to intelligent systems able to understand written and spoken language just like humans, along with reasoning and context, eventually producing speech and writing. NLP plays an essential role in the creation of conversational experiences.

Voice-based experiences, such as Alexa’s voice services (AVS), will become pervasive over the next few years. It is projected that by 2020, 30 percent of web browsing sessions will happen without a screen.* (*Source: Gartner analysts at Symposium/ITxpo 2016.)

The core of the AVS experience is a combination of automated speech recognition, natural language processing, and a cloud-based AI that comprise a voice-based user experience.

As with most artificial intelligence entities, learning new skills is how personalized and contextual experiences will be created. With Alexa, it is possible to “teach” new conversational elements and interactions through developing skills.
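To illustrate what "teaching" Alexa can look like in practice, here is a minimal skill-handler sketch in the AWS Lambda style, following the Alexa Skills Kit JSON response format; the intent name and wording are hypothetical.

# Minimal sketch of an Alexa skill handler that "teaches" Alexa one new interaction.
# The GetTipIntent name is hypothetical; the response structure follows the ASK format.

def lambda_handler(event, context):
    request = event.get("request", {})

    if request.get("type") == "LaunchRequest":
        return build_response("Welcome. Ask me for today's tip.", end_session=False)

    if request.get("type") == "IntentRequest":
        intent = request["intent"]["name"]
        if intent == "GetTipIntent":  # hypothetical custom intent
            return build_response("Here is today's tip: keep prompts short.", end_session=True)

    return build_response("Sorry, I didn't catch that. Try asking for a tip.", end_session=False)

def build_response(text, end_session):
    return {
        "version": "1.0",
        "response": {
            "outputSpeech": {"type": "PlainText", "text": text},
            "shouldEndSession": end_session,
        },
    }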

Here is an example from Domino’s pizza that allows consumers to order pizza directly through Alexa voice services.

Alexa skill development is one of the quickest ways for brands to connect with the rapidly growing audience that calls upon Alexa to empower their daily lives.

Fitbit is another brand leveraging Alexa-based skills to extend brand engagement. Traditionally Fitbit users depend on an app to visualize their data. With the Fitbit Alexa skill users can get a quick update on the stats that matter the most without the need of a screen.

Deep learning

Deep learning builds on machine learning using neural networks. Neural networks are statistical models directly inspired by, and partially modeled on, biological neural networks such as the human brain. The use of neural networks is what differentiates deep learning from cognitive computing.
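As a simplified illustration of the neural network building block, here is a minimal sketch that fits a tiny network on toy XOR-style data (a pattern a single linear model cannot capture); it assumes TensorFlow/Keras is installed and is illustrative only.

# Illustrative sketch of a small neural network, the building block that
# separates deep learning from classic machine learning.

import numpy as np
from tensorflow import keras

X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype="float32")
y = np.array([[0], [1], [1], [0]], dtype="float32")  # XOR: not linearly separable

model = keras.Sequential([
    keras.Input(shape=(2,)),
    keras.layers.Dense(8, activation="relu"),   # hidden layer of "neurons"
    keras.layers.Dense(1, activation="sigmoid"),
])
model.compile(optimizer="adam", loss="binary_crossentropy")
model.fit(X, y, epochs=500, verbose=0)

print(model.predict(X, verbose=0).round())  # approximates XOR after training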

Deep learning is currently redefining Google’s approach to search, and search engine optimization (SEO) will never be the same. Previously, Google search results were based on algorithms defined by a strict set of rules and SEO was based on regression models that looked at past behavior to adjust a given strategy.

With the introduction of RankBrain, Google’s machine learning technology, in 2016, search algorithms are now enhanced with artificial intelligence. Google is now processing roughly 15 percent of daily queries by mixing the core algorithms based on each search type.

The system is adept at analyzing words and phrases that make up a search query. It also decides what additional words and phrases carry similar meaning.

Expect the percentage of search queries handled by AI to significantly increase. Marketers will need to rethink site architecture, content, and the signals being sent via backlinks as the systems continue to learn on a query-by-query basis.

Predictive application programming interfaces (APIs)

A predictive API uses AI to provide access to predictive models, or exposes the ability to learn and create new models.

Fortune 500 company USAA is analyzing thousands of factors to match broad patterns of customer behavior through its intelligent virtual assistant Nina.

As we shift from consumers using technology to technology enhancing consumers, predictive APIs will play a key role in providing recommendations, enhancing customer service, and providing real-time analytics without in-house data scientists. This is key to unlocking new forms of value exchanges with consumers in a hyperconnected world.
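The calling pattern itself is straightforward; here is a hedged sketch in which the endpoint, token and payload fields are hypothetical placeholders rather than any real vendor's API.

# Hedged sketch of calling a predictive API. Endpoint, token and fields are placeholders.

import requests

API_URL = "https://api.example.com/v1/predict"  # hypothetical endpoint
API_TOKEN = "YOUR_TOKEN_HERE"                   # hypothetical credential

payload = {
    "customer_id": "12345",
    "signals": {"recent_visits": 4, "cart_value": 89.50, "channel": "mobile"},
}

response = requests.post(
    API_URL,
    json=payload,
    headers={"Authorization": f"Bearer {API_TOKEN}"},
    timeout=10,
)
response.raise_for_status()
print(response.json())  # e.g. a next-best-action recommendation with a confidence score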

Image and object recognition

Image recognition finds patterns in visually represented data, pictures, and objects. Facebook and Google are two organizations focused on AI research and solutions in this area.
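As a simplified illustration, here is a minimal sketch that classifies a local image with a publicly available pretrained model; it assumes TensorFlow/Keras is installed and a local file named "product_photo.jpg", and it is not Facebook's or Google's research tooling.

# Minimal sketch of image recognition with a pretrained ImageNet model.

import numpy as np
from tensorflow.keras.applications.resnet50 import ResNet50, preprocess_input, decode_predictions
from tensorflow.keras.preprocessing import image

model = ResNet50(weights="imagenet")  # downloads pretrained weights on first run

img = image.load_img("product_photo.jpg", target_size=(224, 224))
x = preprocess_input(np.expand_dims(image.img_to_array(img), axis=0))

predictions = model.predict(x)
for _, label, score in decode_predictions(predictions, top=3)[0]:
    print(f"{label}: {score:.2f}")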

As image recognition is extended into video and live broadcasts, it will redefine contextual relevance, categorization, and automation of content distribution.

Combined with the advancement of cameras, image recognition and machine learning are transforming the way we process data, including much more than just attitudes and behaviors.

Brand marketers can now leverage images, facial expressions, body gestures, and data collected from IOT-enabled devices to understand the triggers behind behavior and build experiences that anticipate their customer’s needs. This requires brand marketers to transform their data strategy to expand beyond first- and third-party data to also incorporate unstructured datasets that capture affect and unconscious data inputs.

Snap’s pending patent on object recognition is potentially game changing. A recent patent application shows its desire to build object recognition into snaps to enhance recommended and sponsored filters, most likely powered by an AI-based system. This showcases how any object can be used to create immediate context between a consumer and a brand.

Olay launched an AI-powered Skin Advisor that ingested user generated photos and provided recommendations for suitable products.

Dynamic experience generation

AI-based systems not only have the ability to parse through large data sets and offer predictive solutions, but can also drive the creation of dynamic experiences. AI will become a powerful tool for creation, not just analysis.

Many startups are leveraging AI APIs to create intelligent solutions. The Grid (https://thegrid.io) is leveraging AI to automate web design with Molly. Molly analyzes design decisions and creates new web experiences.

Eventually, AI will be a key driver of creating augmented reality experiences. Dynamic experience generation through AI will recreate physics, recognizing gestures and movements that can generate new consumer experiences.

Below, Mark Zuckerberg discusses the future of AR/VR at Facebook’s F8 conference.

The various subsets of artificial intelligence will continue to be interconnected, redefining how we approach connecting with consumers. AI makes it possible to know the consumer better than ever before. If approached correctly, with the right mix of AI subsets leveraged, companies will see their business grow.

This is a repost of my recent iMedia cover story.

Follow Tom Edwards @BlackFin360

In The News: iMedia 7 Ways AI Enhances Marketing Cover Story

This morning my new article 7 ways artificial intelligence will enhance marketing was the cover story for iMedia Connection.

The article reviews seven subsets of artificial intelligence (machine learning, cognitive computing, natural language processing, deep learning, predictive APIs, object recognition and dynamic experience generation) and how brand marketers can better uncover insights, connect with consumers, and redefine customer experiences using this innovative technology.


Follow Tom Edwards @BlackFin360

In The News: Entrepreneur.com & AI

I recently sat down with Jeffrey Hayzlett of C-suite TV for the first episode of season 7 for Executive Perspectives live.

He recently wrote a piece for Entrepreneur.com outlining 5 business trends that will take off in 2017. Jeffrey referenced our conversation regarding automation of conversational experiences through artificial intelligence.


The infusion of voice-based technology into consumer products, and the ways in which brands are shifting from social media to social messaging strategies were the subject I addressed with Epsilon Chief Digital Officer Tom Edwards, during a recent interview. Edwards told me how “disruption is the new normal” and how chatbots are the next thing chief marketing officers will have to deal with as technologies keep evolving.

For more insight from the discussion here is a link to the full interview.


Follow Tom Edwards @BlackFin360

In The News: Marketing Dive & 2017 Trends

I was recently asked by Marketing Dive about how digital marketing will evolve in 2017.


One of the key territories I discussed for this piece was the role artificial intelligence, machine learning and cognitive experiences will play in the near future.

“From leveraging machine learning to accelerate sentiment analysis and domain-specific insights, to cognitive computing solutions that automate experiences without human intervention, to the rise of voice-based user experiences that will continue to expand in 2017, to deep learning that will fundamentally change how brands approach SEO, to predictive APIs that will expose access to predictive models to further create seamless experiences for consumers, cognitive and intelligent systems will play a key role in how we approach marketing in 2017,” said Tom Edwards, Chief Digital Officer at the agency within Epsilon.

When asked about social media marketing in 2017:

“Marketers will need to shift their strategy from one of personification of the brand to a seamless experience that is about simplifying and predicting needs while also empowering consumers to create their own stories,” said Epsilon’s Edwards.

Follow Tom Edwards @BlackFin360

 

Trends To Watch in 2017

Technology is now essential to our daily lives. Accessibility and empowerment have transformed how we connect and communicate. This has led to new forms of user interaction that will usher in the business models of the future.

2017 will bring new types of conversational experiences that connect with consumers. It will see the continued evolution of artificial intelligence and connected systems, as well as the rapid rise of third-party ecosystems supporting virtual, augmented and mixed reality.

The following trend deck outlines the evolution of marketing in 2017 through the consumer centric filters of connection, cognition and immersion and is now available for download.


  • CONNECTION – Trends that reimagine how we connect, enable and empower consumers.
    • Examples include: Simplified Conversational Experiences, Pervasive Voice-Based Interfaces, Search and Retrieval to 1:1 Prediction, Affective Datasets and eSports


  • COGNITION – Trends where machine based intelligence will disrupt and redefine data assets and how we work.
    • Examples include: Machine Learning as a Service, Centaur Intelligence, Blockchain & AI


  • IMMERSION – Trends that align technology and presence to evoke emotion, entertain and power commerce.
    • Examples include: Democratization of VR, VR Commerce, Social VR, (Re)Mixed Reality


  • ZONE OF CONVERGENCE – Trends that align elements of connection, cognition and immersion that will redefine consumer engagement.
    • Examples include: Cars as the next Mobile Platform, Holographic Computing, Ambient Computing.


How we consume and interact via digital channels is about to be absorbed and redefined. We believe that 2017 will begin the convergence of connection, cognition and immersion toward an ambient computing future built on new data types that will simplify complex tasks and predict need states rather than simply reacting to them.

Download the 2017 Trend Predictions Today!


Follow Tom Edwards @BlackFin360

Galactic Cannibalism & The Future of Marketing

I have spoken a lot recently about how disruption is the new normal. I recently heard someone compare the last five years to a “supernova” of disruption in terms of the intensity and velocity of change.

With the rise of artificial intelligence, conversational & ambient experiences, connected systems and mixed reality on the horizon we are moving well beyond a supernova and are now on the verge of galactic cannibalism.


Galactic cannibalism is when one galaxy collides with another and there is a subsequent absorption of parts of one into the other. From a consumer marketing standpoint how we consume and interact via digital channels is about to be absorbed and redefined through new advancements in connection, cognition & immersion.

The key to surviving and thriving is a comprehensive data strategy, as data assets will serve as the fuel of this shift. Regardless of which galaxies collide, a thorough understanding of data, content, experiences and outcomes is the marketing foundation for the future.

Follow Tom Edwards @BlackFin360

Voice Based UI Best Practices

Over the past year I have focused research efforts on the shift towards conversational experiences and what consumers expect. The research has been covered by Adweek, and it’s fascinating how open consumers are to engaging with and adopting these experiences as long as they are easy to use and convenient.


One flavor of conversational experiences is tied to voice based user experiences. I recently visited Amazon HQ in Seattle and wrote about my experience with the newly formed Amazon Alexa partner team and the rise of voice based user experiences.

Since this article was published, I have seen client interest and demand for voice-based concepts and skill creation rise as our brand partners see the potential of voice-based systems.

Here is a slide from a recent client presentation. Almost every meeting over the past few months has included discussions around voice based UI.


I strongly believe that over the next few years we will see a convergence in which elements that enable connection, such as social messaging and voice-based conversational user experiences, combine with cognitive computing (AI) and immersive experiences such as holographic computing to become interconnected and redefine how we approach connecting with consumers.


Voice-based experiences will play a key role during this time, as our interactions with connected systems and the rise of microservices as a primary mechanism to navigate a hyper-connected world become the new normal.

We will begin to see services such as the Alexa Voice Service quickly proliferate throughout third-party devices, from in-home IoT systems to connected vehicles, and “skills” will become a key component of how we navigate beyond screens. Estimates already show over 28 billion connected devices by 2019.


Developing voice based experiences differs greatly from visually driven experiences. Visual experiences provide immediate context and cues to the end user that can guide the user and enhance the experience.

Here are five emerging voice UI design patterns the Amazon team and I discussed, along with best practices and points to consider when designing voice-based skills.

1) Infinitely Wide Top Level UI

With a mobile user experience, users have the benefit of visual cues that can guide their actions within a given experience, be it a hamburger menu or on-screen prompts. With voice-based UI, the top level of the UI is infinitely wide. Here are a few best practices for designing against an infinitely wide top level.


Don’t assume users know what to do – The first time a voice skill is initiated, it's important to provide additional detail and tell the user what options they have for interacting with your experience.

Expect the Unexpected – Unlike visual interfaces, there is no way to limit what users can say in speech interaction. It’s important to plan for reasonable things users might say that are not supported and to handle them intelligently.

2) Definitive Choices – The key to successful voice UI design is to make the next consumer action clear. Consumers will not always say what they want, so it is incredibly important to map intent beyond the normal function of a skill. For example, a consumer may end a session by uttering done, quit, etc., and the skill needs to provide a clear path for ending the session. Here are additional points to consider, followed by a brief response sketch.


Make it clear that the user needs to respond – Ask the user a question vs. simply making a statement.

Clearly present the options – Prompts are very important, especially if the question set is an either/or vs. yes/no.

Keep it Brief – Speech is linear and time based. Users cannot skim spoken content like visual content. Quick decisions are key, so voice based prompts should be short, clear and concise.

Avoid too many choices – Make sure choices are clearly stated, do not present more than three choices at a time, and avoid repetitive words.

Use Confirmation Selectively – Avoid dialogs that create too many confirmations, but confirm actions of high consequence.
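Here is a brief sketch of what a definitive-choice prompt can look like in the Alexa response format, pairing a clear either/or question with a short reprompt; the wording is illustrative only.

# Sketch of a "definitive choice" prompt: a clear either/or question plus a short reprompt.

def build_choice_prompt():
    return {
        "version": "1.0",
        "response": {
            "outputSpeech": {
                "type": "PlainText",
                "text": "Would you like the quick recipe or the full recipe?",  # clear, two options
            },
            "reprompt": {
                "outputSpeech": {
                    "type": "PlainText",
                    "text": "You can say quick or full.",  # short guidance if the user stalls
                },
            },
            "shouldEndSession": False,  # we are waiting on an answer
        },
    }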

3) Automatic Learning

One of the areas I am most excited about over the next few years is the intersection of artificial intelligence and the ability to apply machine learning and other higher level algorithms to create more personalized experiences. For Voice based UI it is important to understand how sessions can persist over time.


Obtain one piece of information at a time – Users may not always give all of the information required in a single step. Ask for missing information step by step and focus on a progressive profiling strategy vs. lead capture.

Develop for Time Lapse – It is possible to create skills that allow for sessions to persist with end users. This can be hours or days. This can allow more data to be collected across sessions.

Personalize Over Time – As sessions persist and users interact with skills it is possible to further personalize the experience over time based on previous interactions.
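A minimal sketch of that progressive profiling pattern follows, with a plain dictionary standing in for a persistent store such as DynamoDB; the attribute names are illustrative only.

# Hedged sketch of progressive profiling across sessions: collect one attribute
# at a time and persist it between sessions.

PROFILE_STORE = {}  # user_id -> saved attributes (stand-in for a real database)

PROFILE_QUESTIONS = [
    ("favorite_cuisine", "What cuisine do you cook most often?"),
    ("household_size", "How many people do you usually cook for?"),
]

def next_prompt(user_id: str) -> str:
    profile = PROFILE_STORE.setdefault(user_id, {})
    for attribute, question in PROFILE_QUESTIONS:
        if attribute not in profile:
            return question  # ask for one missing piece of information at a time
    return "Welcome back! I have everything I need to personalize your recipes."

def save_answer(user_id: str, attribute: str, value: str) -> None:
    PROFILE_STORE.setdefault(user_id, {})[attribute] = value  # persists across sessions

if __name__ == "__main__":
    print(next_prompt("user-1"))
    save_answer("user-1", "favorite_cuisine", "italian")
    print(next_prompt("user-1"))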

4) Proactive Explanation

With traditional visual design, a user can open a web page or a mobile app and the information design shows them what to do. With voice you don’t have a page, so the ability to clearly articulate definitive choices, in addition to providing proactive explanations such as tutorials or help, is critically important to reduce user frustration.


Offer help for Complex Skills – If a skill does more than three functions, it is important to not overload a single prompt to the user. Present the most important information first, along with the option of a help session.

Make sure users know they are in the right place – In speech only interactions, users do not have the benefit of visuals to orient themselves. Using “landmarks” tells users that Alexa heard them correctly, orients them in the interaction and helps to instill trust.

Use Re-Prompting to Provide Guidance – Offer a re-prompt if an error is triggered. This should include guidance on next steps.

Offer a way out if the user gets stuck – Add instructions into the help session, such as “You can also stop, if you’re done.”

Don’t blame the user – Errors will happen. Do not place blame on the user when errors happen.

5) Natural Dialog

Research shows that people are “voice activated” and we respond to voice technologies as we respond to actual people. This makes the crafting of voice based narratives incredibly important as the dialog needs to be natural, consumable and written for the ear not the eye. Here are a few key points to consider for enhancing natural dialog within a skill.


Present information in consumable pieces – Humans retain only a small amount of the information they hear, so present only what is absolutely required in order to keep the interaction as short as possible.

Break longer lists into chunks of three to five items and ask the user if they want to continue after each chunk.

Write for the Ear, not the Eye – The prompts written for voice-forward experiences will be heard, not read, so it’s important to write them for spoken conversation. Pay attention to punctuation.

Avoid Technical & Legal Jargon – Be honest with the user, but don’t use technical jargon that the user won’t understand or that does not sound natural. Add legal disclaimers to the Alexa app for users to read and process.

Rely on the text, not stress and intonation – Use words to effectively convey information. It is not possible to control the stress and intonation of the speech. You can add breaks but cannot change elements such as pitch, range, rate, duration and volume.

Clarify Specialized Abbreviations and Symbols – If an abbreviation or value, such as a phone number or chemical compound, is somewhat specialized, be sure to test the text-to-speech conversion to see if additional steps need to be taken.
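To show what writing for the ear can look like in practice, here is a small illustrative sketch that wraps an SSML prompt with a pause and a spelled-out phone number; the wording is an example, and the resulting text-to-speech output should always be tested.

# Sketch: short sentences, an explicit pause, and say-as handling for a phone number.

SSML_PROMPT = """
<speak>
    Your order is confirmed.
    <break time="500ms"/>
    We will call you at
    <say-as interpret-as="telephone">5551234567</say-as>
    if anything changes.
</speak>
""".strip()

def build_ssml_response(ssml: str) -> dict:
    return {
        "version": "1.0",
        "response": {
            "outputSpeech": {"type": "SSML", "ssml": ssml},
            "shouldEndSession": True,
        },
    }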

One final takeaway regarding the Alexa voice-based system is its proximity to transaction and list creation via Amazon’s core services. This, combined with six years of development tied to the Alexa Voice Service and the rising partner ecosystem, signals the convergence of connection, cognition and immersion.

Follow Tom Edwards @BlackFin360

Amazon Alexa & Voice User Experiences

Since it first arrived at my home nearly a year ago, I have been hooked on the Amazon Echo and the potential of voice-based user experiences. This week I spent time in Seattle at Amazon HQ, meeting with the Alexa partner team to discuss everything from voice UX best practices to skills development for Alexa and more.


To recap, the Echo and its cloud-supported, voice-based engine Alexa have been in development for the last six years. Since its initial launch, the devices that comprise the Echo ecosystem are regularly sold out, and based on the nearly 40,000 stellar customer reviews (4.5 stars), the experience is resonating with its users.


The core of the experience is a combination of automated speech recognition, natural language processing and a cloud based AI that comprise a voice based user experience. Voice UX is another example of a conversational experience and will become pervasive over the next few years.


As with most artificial intelligence entities, learning new skills is how personalized and contextual experiences will be created. With Alexa, it is possible to “teach” the assistant new conversational elements and interactions through developing skills.


An analogy would be when Neo in The Matrix “learns” kung fu through a knowledge/skill upload. In a similar way, Alexa may not be able to learn kung fu, at least not yet, but it is possible to build highly engaging voice-based experiences.


Developing skills for Alexa is one of the quickest ways for brands to connect with the rapidly growing audience that calls upon Alexa to empower their daily lives. Brands such as Domino's and Capital One have already launched skills to capitalize on being the first to own certain invocation phrases. With the Domino's skill a user can order a pizza and track their order through Alexa.


Skills are comprised of a Skill Interface and a Skill Service. The Skill Interface is how the voice user experience is configured. This includes invocation and utterance phrases from the user as well as the mapping of intent schemas scored and resolved by the Skill Service. This is how Alexa is trained to resolve a user's spoken words, connect them with the user's intent and turn that intent into action.
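For illustration, here is a small sketch of those Skill Interface pieces: an intent schema with a slot, plus the sample utterances that map spoken phrases to that intent. The intent, slot and phrases are examples, not a published skill.

# Sketch of a Skill Interface: intent schema plus sample utterances. Names are illustrative.

INTENT_SCHEMA = {
    "intents": [
        {
            "intent": "OrderPizzaIntent",
            "slots": [{"name": "Size", "type": "PIZZA_SIZE"}],  # custom slot type
        },
        {"intent": "AMAZON.HelpIntent"},
        {"intent": "AMAZON.StopIntent"},
    ]
}

# Sample utterances: "IntentName  spoken phrase"
SAMPLE_UTTERANCES = """
OrderPizzaIntent order a {Size} pizza
OrderPizzaIntent I would like a {Size} pizza
OrderPizzaIntent get me a {Size} pizza
""".strip()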


One of the benefits of Alexa is that the experiences can persist beyond a single session. Even though the experiences may seem ephemeral by nature, the fact is Skills can be created that persist across sessions. This could be hours or days.


The other benefit is that all invocations and interactions are mapped to cards in the Alexa companion app. This is one way that brands can connect a skill interaction with mobile and digital campaigns.


Another benefit for brands is that it is possible to deep link to skills within the Alexa companion app for those looking to connect omnichannel communication and messaging to drive discoverability of the skill.

One of the key points for brands to consider is the role being “first” can play when it comes to user invocation terms. Brands that align with non-trademarked terms such as “laundry” will be the first in the order of how skills are discovered. This is key as the Alexa engine expands beyond the Echo with the Alexa Voice Service.


Looking to the near future there will be 45 million connected homes by 2017 and connected car penetration will be over 60 million cars by 2020. The role that Alexa will play in the coming years will go well beyond the Echo, Dot, Tap & the Fire Stick and extend into other form factors through the portable Amazon Alexa Voice Service.


An example is the connected car partnership between Ford & Amazon to further connect Alexa. This is where the platform will create scale across the ever growing IOT ecosystem.


Future posts will cover emerging trends tied to voice-based user experiences such as the infinitely wide top-level UI, definitive choices, automatic learning, proactive explanation and natural dialog. For additional questions or assistance with Alexa Skills, please follow Tom Edwards @BlackFin360

Facebook F8 2016 Trend Recap

I recently attended Facebook’s F8 developer conference in San Francisco and the event did not disappoint. Mark and the Facebook team outlined their approach to a ten year roadmap, launched the highly anticipated Messenger chat bot beta and showcased their first concepts of a social virtual reality experience.


The presentation below covers:

•  The 10 year roadmap analysis

•  The Rise of Chat bots

•  Immersive Experiences & Social VR

The 10 year Roadmap


This was the 10 year roadmap presented at F8. It follows the lifecycle continuum approach outlined in the previous slide.

Facebook proper is the most mature and has a thriving 3rd party ecosystem as well as a sustainable monetization model.

Messenger has been identified as the next ecosystem, with powerful tools released at F8 2016 to drive conversational commerce and a new approach to replacing apps.

VR, Connectivity and AI represent the near future for Facebook and Social VR will be a key area to watch. Developing strategies that capitalize on creating value today while experimenting for the future is key.

For analysis on Facebook’s 10 year roadmap including Facebook’s approach to product lifecycle, Facebook proper, the Live video API, approach to connectivity, artificial intelligence and Facebook’s investment in hardware and open platforms view slides 4-12 in the embedded slideshare.

The Rise of Chatbots

With 900M users and over 1 billion messages sent per month, Facebook felt that Messenger had progressed through its product lifecycle continuum and had hit the inflection point of scale needed to build out an ecosystem that solidifies and sustains Messenger as the go-to mobile application.


The key is that Messenger will support one bot to many pages. This makes it easy to seamlessly connect brands or services in a portfolio to create compelling and unique experiences that are 1:1.

Since Facebook does not own the mobile hardware or the operating system, they are positioning Messenger threads as a replacement for native apps.

For in-depth analysis of chat bots including an overview, conversational commerce, the send & receive API, wit.ai, discovery within Messenger, promotion and conversational advertising, view slides 14-22 of the embedded slideshare.
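For a sense of the mechanics, here is a hedged sketch of the Send API pattern: the bot posts a message to a user through the Graph API. The page token and user ID are placeholders, and the API version should be checked against the current Messenger Platform documentation.

# Hedged sketch of sending a Messenger message via the Send API. Token and ID are placeholders.

import requests

PAGE_ACCESS_TOKEN = "YOUR_PAGE_ACCESS_TOKEN"  # placeholder
RECIPIENT_PSID = "USER_PAGE_SCOPED_ID"        # placeholder

def send_text(text: str):
    response = requests.post(
        "https://graph.facebook.com/v2.6/me/messages",
        params={"access_token": PAGE_ACCESS_TOKEN},
        json={
            "recipient": {"id": RECIPIENT_PSID},
            "message": {"text": text},
        },
        timeout=10,
    )
    response.raise_for_status()
    return response.json()

if __name__ == "__main__":
    send_text("Thanks for reaching out! How can we help today?")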

In addition to this POV, our Epsilon agency team wrote a comprehensive eBook that launched when Facebook announced the Messenger beta. The eBook covers the shift from social media to messaging and the role data, chat bots and conversational commerce will play for brands.

Social Shift Toward Messaging

Virtual & Augmented Reality

Facebook states that virtual reality is the next evolution of computing and is heavily invested in the hardware and experiences that will comprise aligning technology with presence.


During F8, Facebook outlined a path forward for active VR experiences, demonstrated social VR concepts publicly for the first time, and identified augmented reality as a viable disruptor; to date, all the conversation had been about VR experiences.

Virtual Reality experiences are coming and the key will be empowering consumers to create their own immersive experiences. Facebook’s long term goal is to create completely virtual experiences that recreate the physical world. For now wave 1 will be avatar based.

For in-depth analysis of virtual reality including an overview of the role of the Gear VR in the ecosystem, Oculus Touch, the first public demo of Facebook’s Social VR concepts and the bets of the future review slides 23-29 of the embedded slideshare.

For more insights and analysis follow Tom Edwards @BlackFin360


In The News: Chatbots & E-Commerce

I was recently asked by ClickZ for commentary about what role chatbots can play for e-commerce.


Are Chatbots the future or fad?

I am a believer that chatbots are a key element in the creation of conversational user experiences and will become core to the messaging experience. Chatbots will introduce new interaction models with new rules of engagement and capabilities that will flow seamlessly based on user interactions, versus installing and swapping between multiple apps.

A Messenger chatbot ecosystem could rival and ultimately replace traditional app marketplaces, and conversational chatbots, be they artificial intelligence or bots augmented by humans, will become the new standard for content delivery, experiences and transactions.

We view messaging apps as the new brand portal, conversational user experiences as the new interface and chatbots as the new apps. What makes this approach unique is that it’s permission-based, contextually relevant, immediate and native to mobile.

How can brands use chatbots to enhance their ecommerce?

Conversational commerce will be a key value proposition from messaging platforms. Our Epsilon research shows that messaging significantly impacts purchasing behaviors. Notably, consumers take photos, screenshots, and conduct video chats in real time to seek out assistance during their shopping process.

Brands can build bots with topical response decision trees that align with creating seamless paths to products and services. An example is how Sephora recently partnered with Kik to create a bot driven experience that led a customer through a personalized journey that ends with conversion directly within the conversation.


With Facebook’s upcoming launch of third-party chatbot support, it is empowering chatbot developers with tools to create structured messages that include images, descriptions, calls to action and URLs to connect conversation to commerce.
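Here is an illustrative sketch of such a structured message: a generic template card with an image, description and a call-to-action button linking out to a product page. The values are placeholders, and field names should be verified against the current Messenger Platform documentation.

# Sketch of a Messenger structured message (generic template). Values are placeholders.

STRUCTURED_MESSAGE = {
    "attachment": {
        "type": "template",
        "payload": {
            "template_type": "generic",
            "elements": [
                {
                    "title": "Hydrating Face Serum",
                    "subtitle": "Personalized pick based on your skin profile",
                    "image_url": "https://example.com/images/serum.jpg",  # placeholder
                    "buttons": [
                        {
                            "type": "web_url",
                            "url": "https://example.com/products/serum",  # placeholder
                            "title": "Buy Now",
                        }
                    ],
                }
            ],
        },
    }
}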

The key for brands to understand is that, for now, chatbots are domain-specific rather than general intelligence. This means there is an opportunity to capture data upfront to establish a frictionless and personalized experience for consumers.

Follow Tom Edwards @BlackFin360

Thriving Through Digital Disruption

I had the pleasure of speaking during today’s Brand Activation Summit in NYC. I joined an esteemed panel comprised of a CEO, a CMO and me (a CDO) to discuss thriving in the age of digital disruption.


My topics ranged from the role of the Chief Digital Officer to vertical-specific discussions tied to the future of digital. Over the course of an hour I discussed many topics that I have recently written about or spoken on publicly.

It was a great discussion with a highly engaged audience.


Follow Tom Edwards @BlackFin360
