Trends for 2018 and Beyond

It’s my favorite time of year. No, I am not talking about Santa & reindeer, I am talking about predictions!

Here is a full analysis of potential trends to consider heading into 2018, presented in a non-linear, choose-your-own-adventure video format. Just click the image below.

This video starts with a brief intro, then you will have the option to select a specific section of interest or watch it straight through from beginning to end.

In 2017 I spoke a lot about the evolution of experience through the E^3 Innovation To Reality series. 2018 will bring us closer to a convergence of technology that empowers, intelligent systems that enhance, and a physical-digital line that continues to blur.

EMPOWER – Highlighting everything from camera as a platform, virtual reality gets social, evolution of social messaging, physical experiences that enhance digital, immersive eSports, expansion of contextual commerce and more, this section dives into technology that empowers consumers.

ENHANCE – How AI and intelligent systems are accelerating the evolution of experience. Topics include the democratization of AI, device-based machine learning, blockchain put to use, visual discovery, voice + visual conversational experiences, and personalized audio.

ENVIRONMENT – How our world is shifting towards synthetic reality through the convergence of location, computer vision and mixed reality, which will reshape how we interact with the physical world. This section includes pervasive robotics, biometric security, augmented art, mixing reality, connected intelligence, and brain-controlled interfaces.

EXPERIENCE – This section highlights the convergence of empower, enhance and environment. Topics include Gen Z enhance vs. create, system-based marketing, mobile disruption, ambient computing, and synthetic reality.

Download the complete 2018 trends presentation. 

Follow The Epsilon Agency Innovation Team:

Tom Edwards @BlackFin360

Steve Harries @Steve_Harries22

Ian Beacraft @IanBcraft

Jeremy Olken @JeremyOlken


Podcast – Disruption Is the New Normal

I recently had the privilege to join the Up & Out with Connie podcast, discussing how disruption is the new normal. Up & Out is featured on iHeartRadio, SiriusXM and C-Suite Radio. You can listen here.

I discuss the impact of emerging technologies on consumer behavior, the acceleration of disruption, and the role of the empowered consumer.

We also discuss the shift from content to contextual marketing and the role of data, specifically the data of culture and the data of identity, as well as the role of intelligent systems, augmented intelligence and artificial intelligence.

We end the discussion with how we interact with artificial intelligence, how I work with start-ups, and personal questions such as a favorite quote, how I started in technology, failures that led to success and more.

  • Listen to the podcast on C-Suite Radio here
  • Download the episode via Apple
  • Listen via iHeartRadio

Follow Tom Edwards @BlackFin360

BlackFin360 on Amazon Alexa

I received great news this morning that the BlackFin360 Alexa Skill passed certification and was published to the skill store.

Now you can access key insights and information via Alexa by enabling the BlackFin360 skill. Simply search “BlackFin360” on the Amazon home page or enable it via the Amazon Skills Store here.


The skill supports several key intents for surfacing insights and information.

In the coming weeks I will be adding an RSS feed of the blog content and evaluating YouTube playlists and the Echo Show. I would appreciate it if you could enable and rate the skill. Thanks in advance.
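For readers curious how a skill like this maps a spoken request to a response, here is a minimal sketch of Alexa-style intent routing. The intent names ("LatestInsightIntent", "AboutIntent") are hypothetical stand-ins, not the skill's actual intents, and a production skill would use the Alexa Skills Kit SDK rather than a hand-built dictionary.

```python
# Minimal sketch of Alexa-style intent routing. Intent names are invented
# for illustration; a real skill would be built on the Alexa Skills Kit.

def handle_request(intent_name: str) -> dict:
    """Map an incoming intent name to an Alexa-shaped response body."""
    responses = {
        "LatestInsightIntent": "Here is the latest insight from BlackFin360.",
        "AboutIntent": "BlackFin360 covers emerging technology trends.",
        "AMAZON.HelpIntent": "You can ask for the latest insight, or say stop.",
    }
    speech = responses.get(intent_name, "Sorry, I didn't catch that.")
    # Alexa expects the spoken reply inside an outputSpeech object.
    return {
        "version": "1.0",
        "response": {
            "outputSpeech": {"type": "PlainText", "text": speech},
            "shouldEndSession": True,
        },
    }
```

The dictionary lookup with a fallback mirrors how a skill's request handler dispatches on intent name, with an unhandled-intent reprompt as the default path.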

Follow Tom Edwards @BlackFin360
Find BlackFin360 via Amazon Alexa

68 Top Trends So Far in 2017

Over the past six months, my team and I have evaluated the top emerging technology trends that will fundamentally reshape how marketers will connect with consumers.

Here is a brief preview:

The full analysis includes 68 trends categorized by our trend framework of Empower, Enhance, Feel & Ambient Computing. This will replace our original framework of Connection, Cognition & Immersion.

Empower to create content, engage and connect through new interfaces and touchpoints.

Enhance your daily life activities and responsibilities through intelligent systems and proxies.

Feel emotional experiences like pleasure and excitement delivered through immersive computing.

Ambient computing is the alignment of all three behavioral drivers.

Download the 2017 Midyear Trend Deck Today!

Follow Tom Edwards @BlackFin360

Symposium 2017

I had the privilege to speak at and host Epsilon Symposium 2017. With hundreds of clients in attendance, I was tasked with discussing the role of artificial intelligence across Epsilon and Conversant, as well as teasing examples of emerging technology that my team and I are working on.

Here are the key highlights I discussed during Symposium 2017. Part of my role is evaluating and embracing the latest innovations and determining how they connect to our Epsilon and Conversant solutions.

Whether that's through conversational and voice-based experiences such as Alexa Voice Services, Google Assistant and Siri, the artificial intelligence work happening at Epsilon and Conversant, or immersive computing such as augmented and virtual reality, each is bolstered by our data, insights and creative execution.

For the last ten years I have talked about how disruption is the new normal, how emerging technology can impact consumer behavior, and what it means for marketers.

Today we are at an inflection point, where we are seeing the shift from mobile first to AI first. It's less about disruption and more about acceleration through intelligent systems.

That’s where Epsilon and Conversant’s heritage of aligning data and technology and driving innovation is the key to leveraging whatever the future may bring and where consumers will be.

Within the agency business, we are using Machine Learning to categorize the data of culture along with our data of identity to fuel our creative approach.

From a product perspective, we are also achieving harmony (pun intended) through machine learning and AI via a centralized intelligence hub for decisioning across channels.

Finally, Conversant is at the forefront of integrating AI through machine learning and image recognition to create world-class speed and scale: every 5 minutes, consumer actions across 160 million individual profiles lead to over a billion model updates.

The key moving forward is empowering consumers, enhancing solutions through artificial intelligence and creating immersive experiences.

Regardless of how the future state shifts and evolves, be it through bots becoming agents on our behalf, consumer-based journeys expanding to include system-based journeys, or a hyper-connected augmented reality future, all of those elements will be highly dependent on data and decisioning as the foundational element.

Follow Tom Edwards @BlackFin360

Apple WWDC 2017 Full Recap

When I think of Apple, three things come to mind: the industrial design of its hardware, interoperability across products, and of course millions of apps. After WWDC 2017, I need to add artificial intelligence (AI) enabled experiences, device-level privacy and a new focus on augmented reality.

Here is the Full Recap:

AI was the key theme of WWDC (mentioned 20 times in 2.5 hours). Apple highlighted how both machine learning and deep learning are now integrated across multiple products, from Apple Watch and Siri to facial recognition in photos and even handwritten notes in iOS 11. AI-integrated experiences were one of the more important areas discussed during WWDC.

WWDC also saw a new hardware launch in the form of the HomePod, Apple's entry into the smart speaker market. While Siri is integrated into the device, the role it can play for brand marketers is still to be determined, as the skills and actions we have begun to depend on in other product ecosystems were surprisingly absent.

Apple is also investing heavily into enabling augmented reality experiences through hardware and software. With the launch of ARKit, their strategy is to empower the millions of developers to take their AR building blocks and create immersive experiences that are closely mapped to the real world via world tracking for both 2D and 3D elements.

Apple is building a foundation for the future built on device level privacy, artificial intelligence, augmented reality and multimodal computing through evolving Siri beyond handsets into cars and the home with Homepod.

Here is a quick reaction video following the WWDC Keynote.

Follow Tom Edwards @BlackFin360, Ian Beacraft @IanBCraft, Steve Harries @Steve_Harries22

5 Midyear Trends to Watch in 2017

2017 has seen a rapid acceleration of technology trends. Of the 50+ trends observed from CES, MWC, SXSW, F8, Google I/O and more, here are the top 5 midyear trends that I am closely monitoring heading into 2018.

1) MOBILE FIRST TO AI FIRST

For the past few years, Facebook, Google and other industry heavyweights have proclaimed to be mobile-first organizations. Now at the midpoint of 2017 we are seeing shifts from mobile first to AI first. Google recently announced their intent to redefine their core products around AI research, tools and applied AI.

Through 2017 Machine Learning (ML) and Artificial intelligence (AI) are rapidly transforming business, products and services. A primary fuel for ML/AI is data. Understanding how to create actionable data centric AI experiences is critical to drive growth in 2017 and beyond.

2) MULTIMODAL INTERFACES

Conversational experiences have been a primary topic of discussion in 2017. From bots to voice based experiences, to computer vision and object recognition, expanding solutions beyond mobile and desktop has been a major trend through the first part of the year.

The shift towards AI first means text and visual tied to mobile and desktop are not enough to evolve the future of interaction.  As 2017 continues to unfold, we will see more voice + paired visual experiences come to market where voice is driving a visual companion experience to further enhance Alexa Skills and Google Actions.

3) CAMERA AS A PLATFORM

As marketers begin to shift their attention from Millennials to Gen-Z, strategies in the first half of 2017 are shifting towards leveraging the camera as a platform.

From Snapchat's ever-evolving lenses to Facebook's newly announced Frames & AR Studios, major industry players are taking a core native behavior that is all about consumer empowerment and building new solutions that will integrate real-time data, location and object recognition to create new forms of effect-based marketing.

4) RISE OF THE PROXY WEB

The first part of 2017 has shown the first major steps towards the rise of the proxy web. The proxy web is predicated on systems taking over core day-to-day human functions and becoming agents on our behalf. One of the big steps towards this in 2017 was the recent launch by Google of Google Lens.

Google Lens combines the power of Google Assistant with the ability to overlay computer vision, which will serve as the basis for contextual augmented reality that links to various services, from purchasing, to content, to predictive reservations based on traffic and other environmental factors. Voice has led the way in 2017; 2018 will be the year of computer vision-powered experiences.

5) DEMOCRATIZATION OF IMMERSIVE COMPUTING (VR/AR)

One of the drawbacks to mass adoption of virtual reality has been how isolating an experience can be, with limited abilities to share “what's happening.” Both Google and Facebook realize that adoption is closely tied to accessibility and the ability to share experiences. 2017 has seen a major shift towards driving the democratization of virtual reality.

The key to driving adoption at scale is to empower consumers, developers and other third parties to create experiences, from user-generated 360-degree content to co-viewing, casting, capturing and sharing VR content. It's important for brand marketers to pay attention to how consumers interact with these experiences and the rate at which they are creating their own virtual content.

Follow Tom Edwards @BlackFin360

Google I/O 2017 Full Recap

This week I had the opportunity to attend the Google I/O conference in Mountain View, California. It was an incredibly compelling event as Google shifted their focus as a company from mobile first to AI first. This means that all products will be redefined and enhanced through various forms of AI.

This includes the Google Assistant, which was the star of the show. The deck goes into detail, but it's incredibly important that we begin thinking about the role that the Google Assistant plays across home, smartphone, wearables, auto and soon AR. With the iPhone launch announced at the conference, Assistant gains 200 million voice-enabled devices out of the gate.

What is also key to consider is the Google Assistant equivalent of an Alexa Skill, called an Action. Actions can support transactions outside of Amazon and do not require installation. While only a small number of Actions exist today, the ecosystem of Google Assistant-enabled devices is huge and rapidly growing.

Here is the full trend recap and analysis:

Section one covers trends tied to connection & cognition:

  • Vision of Ubiquitous Computing
  • Multi-Modal Computing
  • Google Assistant (Actions, Auto, Computer Vision, Wear)
  • Android O
  • Progressive Web Apps
  • Structured Data & Search

Section two covers all facets of immersive computing:

  • Immersive Computing
  • Daydream (Virtual Reality)
  • Social VR
  • WebVR
  • Visual Positioning Services
  • Tango (Augmented Reality) 
  • WebAR

In addition to the attached recap, there is also a 4 minute “light recap” video:

For third-party commentary, I discussed the role of Google Lens & computer vision with AdExchanger here.

Follow Tom Edwards @BlackFin360

Facebook F8 Full Recap & Analysis

I look forward to Facebook’s F8 developer conference each year. It’s a great opportunity to see how Facebook is prioritizing and adjusting their 10 year road map based on shifting consumer behavior and new advancements in technology. 

What was fascinating about this year's conference is the rate at which they are accelerating the convergence of technologies that connect us, immerse us in new virtual worlds, and advance innovation well beyond what we would expect from a company that identifies itself as social first.

Facebook wants to redefine how we think about reality and the not too distant future when all reality is augmented and virtual. The following provides analysis across the consumer centric filters of connection, cognition and immersion.

  • Connection – Trends that reimagine how we connect, enable and empower consumers
  • Cognition – Trends where machine based intelligence will disrupt and redefine data assets and how we work
  • Immersion – Trends that align technology and presence to evoke emotion, entertain and power commerce

Here are a few examples of the 15 territories analyzed, starting with:

The Camera as the First Augmented Reality Platform – Facebook understands that in order to truly create scale, the key is to empower consumers, developers and other third parties to create experiences on its behalf. Consumer empowerment is powerful and will accelerate adoption, ultimately influencing consumer behavior towards a new normal.



The democratization of augmented reality (AR), powered by advancing artificial intelligence (AI), has the potential to redefine advertisers' approaches to content marketing, making it less about content and more about enabling experiences through compelling and contextually relevant effects.

Frames & AR Studio – Two sets of tools comprise the new Camera Effects Platform. The Frames Studio allows for quick deployment and creation of effects that can enhance an image, video or even Facebook live stream. This platform allows artists, creators and brands to create frames that can be targeted using Facebook targeting abilities for distribution.

The AR Studio is where it's possible to create lightweight AR effects that can be developed and enhanced with elements such as real-time data to build highly contextual AR experiences. This is where brand marketers have an opportunity to align data + experiences.

Gaming & eSports

The convergence of gaming and video has been a massive trend over the past 24 months. Two billion people play games each month, and game streams now draw 665 million people watching others play.

On Facebook people watch, play & create. Facebook’s gaming video product supports eSports (14-31% of live gaming consumption), developers, gaming entertainers and social connection for consumers of game stream content. 

Gaming content is digitally native with real-time interactivity baked in. With gaming video, the audience is more than a spectator; they participate in the experience through comments and direct involvement in the gameplay.

Messenger 2.0 – 2016 was considered the year of the bot, primarily fueled by Facebook's Messenger beta, which accelerated the development of a bot ecosystem to further enhance the Messenger experience.

In 2017, Facebook is positioning Messenger as Messenger 2.0, with a sharp focus on integrating other services via chat extensions, giving third-party bots the ability to seamlessly connect services such as Spotify or Apple Music.

Facebook is also keen on driving discovery among the 100,000 bots now on the platform via the new discover tab.

Data Design & Artificial Intelligence 

Facebook is focused on leveraging multiple facets of Artificial Intelligence to power their products and accelerate 3rd party ecosystems.

Computer vision, natural language processing, and algorithms drive content discovery and their newly launched AR experiences. AI is now a foundational element to Facebook’s go-to-market strategy.

Facebook’s ultimate goal is to develop intelligent systems that go beyond computer vision and truly understand the world. This will then converge with their vision of an AR driven future to create a unified experience.

The Rise of Proxies – In the very near future, we as consumers will have intelligent systems serving the role of a proxy. Facebook is betting on M to first serve as a virtual assistant that will eventually become a predictive service, the foundation for their virtual computing future.

M will integrate into multiple facets of a user's life, from sharing location to recommendations. In the near future, M can become the connection between a recommendation and an AR object-recognition action.

Virtual Reality & Facebook Spaces – Facebook officially launched Spaces for Oculus. It was first teased at F8 last year, and the experience has definitely advanced from the grainy avatars of a year ago.

Facebook took research and learnings from Oculus Rooms via the Samsung Gear and refined an experience that lets your virtual avatar interact with Facebook content and friends in a virtual environment.

From virtual selfies to watching 360 video, it's very clear that Facebook is focused on creating a new form of social interaction via a virtual environment.

The Future – Facebook took the first major step in achieving their 10 year goal of fully immersive augmented reality by launching the camera as their first augmented reality platform.

On day 2 of the conference, they outlined in detail how they envision transparent glasses (deemed more socially appropriate), or some equivalent, paired with a general artificial intelligence system to enhance our daily lives.

This includes improving memory, cognition, recognition and redefining how we interact with the physical world and collaborate with one another.

Here is the full recap consisting of all 15 territories analyzed, plus implications for brand marketers to consider based on each trend identified.

Follow Tom Edwards @BlackFin360

Advertising Age Marketing Technology Trailblazer

Today Advertising Age announced their 2017 list of top 25 Marketing Technology Trailblazers and I am honored to be included.


Photo by Bradley Taylor, Caprock Studio 

A big thank you to the Epsilon corporate communications team, DGC and Advertising Age judges. I am truly humbled by the inclusion with such a great list of industry innovators.

I am incredibly grateful to my data design, strategy and innovation teams. From research, planning, data design, digital strategy, digital experience delivery, social and innovation, a huge thank you for all that you do.


Finally, a very special thank you to my amazing wife Cherlyn for supporting all the crazy hours and travel for the past 17 years.

Follow Tom Edwards @BlackFin360

CX Future = Voice + Visual

I have written articles and commented quite a bit about Amazon Alexa and voice based conversational experiences in the media over the past 12 months.

To date there are over 10 million Alexa-powered devices in consumer homes, and that number is about to increase significantly as Alexa Voice Services integrates into everything from cars, such as the Ford Sync 3 system, to mobile handsets.

Here is an example of Alexa integrated into the Ford Sync 3 system rolling out in various Ford models this fall. 

Regarding Alexa skills: skills are to Alexa what apps are to mobile. When I first met with the Amazon Alexa partner team a year ago, there were barely 1,000 skills published; as of today there are over 10,000, with that number continuing to increase.

In addition to skills, the shift towards voice-based experiences has already begun. In 2014, voice search traffic was negligible; today it exceeds 10% of all search traffic, and virtual assistants handle over 50 billion voice searches per month.

That number is going to continue to accelerate: it's projected that by 2020, over 200 billion searches per month will be done with voice. Voice will quickly become a key horizontal channel and central to a converged user experience.


What most don’t realize though is that while most experiences today are zero UI/voice only experiences, the next evolution of voice based systems will be voice + paired visual experiences.

This will ultimately be driven by new hardware that integrates screens, but initially will be driven by responsive web experiences that are powered by Alexa and hands free.

Soon virtual assistants such as the Sony XPERIA Agent shown here at MWC 2017 will have integrated screens to enhance voice + visual.

Voice-based skills will be able to showcase information visually by aligning voice intents with visual cues, creating a voice-controlled experience that is seamless and engaging.

From dynamic content to video, an Alexa skill can easily answer a query and visually showcase complex steps or highly visual elements, such as what a recipe should actually look like versus having to visualize it in one's mind.

Visual cues on the page can also enhance what a user can do with Alexa, such as highlighting related intents like repeat, help and next steps via a responsive web experience.

This is one of the challenges with pure voice experiences: the user doesn't always know what their options are to further engage different aspects of a given skill.

Voice + visual can also enhance long-term engagement, currently the biggest barrier for Alexa experiences. By considering visual + voice content, it is feasible to extend into more entertainment mediums that can be controlled and enhanced via voice.

Voice + visual also has an impact on the type of data that can be gleaned from progressive profiling, and it opens up new ways to deploy existing content assets into a system-based, virtual-assistant-driven journey.
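One way to picture the voice + paired visual pattern is a single handler that returns both a spoken reply for Alexa and a display payload for the companion web page. The intent names, example URL and payload shape below are illustrative assumptions on my part, not the published skill's actual design.

```python
# Sketch of voice + paired visual: each intent yields speech for Alexa plus a
# display cue for a companion responsive web page. Intent names, the example
# URL and the payload shape are hypothetical illustrations.

def build_voice_visual_response(intent_name: str) -> dict:
    pairings = {
        "ShowRecipeIntent": {
            "speech": "Here is how the finished dish should look.",
            # The companion page would render this image while Alexa speaks.
            "display": {"type": "image", "url": "https://example.com/dish.jpg"},
        },
        "AMAZON.NextIntent": {
            "speech": "Moving to the next step.",
            "display": {"type": "step", "action": "advance"},
        },
    }
    # The fallback surfaces the available options visually, addressing the
    # discoverability gap of voice-only experiences noted above.
    return pairings.get(intent_name, {
        "speech": "You can say repeat, help, or next.",
        "display": {"type": "hints", "options": ["repeat", "help", "next"]},
    })
```

In this sketch the speech portion would be returned to Alexa, while the display portion is pushed to the responsive web experience, for example over a websocket connection.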

I have seen the future through a first-of-its-kind example of voice (Alexa) + visual (responsive web), and it is mind-blowing. I can't show it publicly yet, but it will reframe your approach to voice-based strategy.

I will update this post with visuals once the first voice + paired visual skill is published shortly.

Follow Tom Edwards @BlackFin360

SXSW Interactive 2017 Trend Recap

This past week, over 30,000 digitally centric professionals, including myself and Ian Beacraft, descended on Austin, Texas for SXSW Interactive 2017. Our focus was to meet with key strategic partners, gauge emerging trends, monitor product launches and most importantly create content and POVs.

Content included a comprehensive text based trend recap download, live streaming from the trade show floor as well as a full video recap.

Over the years, SXSW has been an ideal event to gauge and project consumer-centric tech trends: Twitter empowering consumers in 2007, Foursquare focusing on location in 2009, social proximity with Highlight in 2012 and live streaming via Meerkat in 2015.

2017 focused on the rise of intelligent systems from a content perspective and on immersive experiences that bridge physical to digital.

Marketing is quickly shifting from disruptive tech to acceleration through intelligent systems. It’s less about the latest app fad, and more about how quickly the combination of data, intelligent systems and smart environments are going to impact consumer behavior in the future.

The technology featured at SXSWi 2017 aligns with my view of the coming intelligence revolution. This revolution will be built on new data types that will simplify complex tasks, predict need states and usher in new forms of computing that will radically alter how we connect with both consumers and intelligent proxies.

The attached event recap highlights trends across our framework of Connection, Cognition, Immersion & Convergence, which is building towards the acceleration of the Intelligence Revolution.

Connection – Trends that reimagine how we connect, enable and empower consumers.

  •  How conversational experiences are evolving and the impact that voice based experiences will have on the web
  • How social proximity and personalization have been refined
  • How interactive video is evolving

Cognition – Trends where machine-based intelligence will disrupt and redefine data assets and how we work.

  • Understand the evolution of storytelling through AI and the importance of data design
  • How emotive robotics will serve as a bridge between general assistants of today to the intelligent and more human systems of tomorrow
  • Learn more about the friction between artificial intelligence and intelligence augmentation of humans
  • Learn about the pending intelligence revolution and the role that the Proxy Web will play

Immersion – Trends that align technology and presence to evoke emotion, entertain and power commerce

  • Understand the evolution of immersive and full sensory experiences. From new forms of user interfaces such as light to mixed reality and everything in-between

Here is the download for the SXSW 2017 Trend Recap and Full Recap Video.

Follow Tom Edwards @BlackFin360

Follow Ian Beacraft @Ianbcraft

Share this post on Linkedin

LIVE: SXSW Interactive 2017 Recap

Here is a video recap shot live from the floor of SXSW Interactive 2017 on day 1, at the opening of the trade show floor.

The video outlines emerging technology and trends tied to Connection, Cognition and Immersion and touches on key territories such as:

  • Conversational Experiences
  • Emotional Intelligence
  • Artificial intelligence vs. Intelligence Augmentation
  • Mixed Reality
  • The rise of the Proxy Web

Follow Tom Edwards @BlackFin360

LIVE: MWC 2017 Trend Recap

Here is a video recap shot live from the floor of Mobile World Congress 2017 in Barcelona.

The video outlines emerging technology and trends tied to Connection, Cognition and Immersion and touches on key territories such as:

  • Evolution of Conversational Experiences
  • Artificial intelligence and Advancements in Smart Assistants
  • New Types of Interfaces Beyond Mobile
  • The rise of 5G
  • Convergence of Artificial Intelligence and Virtual Reality

Follow Tom Edwards @BlackFin360

In The News: Ad Age Data Design & Alexa

I was recently interviewed by Ad Age discussing the efforts of my data design team and our work with Amazon and the Alexa Skills Kit.


When I first joined the Epsilon agency team, I wanted to bridge traditional brand planning, strategy and data science to uniquely assess all of our data sources and build recommendations that leverage the right data, supporting planning, strategy development and creative with data-driven insights.

Now the agency data design group comprises three core components: 1) mapping the data landscape, 2) storytelling through data, and 3) consulting & training. My goal with this team is to align intelligence from the data, regardless of source, to inform how we communicate and message with consumers as technology and behaviors evolve and, most importantly, to drive performance.

There are three primary areas of focus for the team:

1) Proprietary data sources & methodologies e.g. Leveraging Epsilon’s structured data

2) Unstructured data sources & methodologies e.g. Finding previously invisible insights by applying machine learning & artificial intelligence to unstructured category data

3) New data sources & methodologies e.g. Uncovering new types of data sets that we call affective datasets and how they will impact and reshape how we connect across the consumer journey
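To make the second area concrete, here is a toy sketch of routing unstructured text into categories using bag-of-words cosine similarity. The category seeds are invented for the example; a production pipeline would rely on trained machine learning models rather than this simple scorer.

```python
# Illustrative sketch of categorizing unstructured text with bag-of-words
# cosine similarity. Category seed phrases are invented for the example; a
# real pipeline would use trained ML/NLP models, not this toy scorer.
import math
from collections import Counter

def vectorize(text: str) -> Counter:
    """Turn text into a simple bag-of-words term-frequency vector."""
    return Counter(text.lower().split())

def cosine(a: Counter, b: Counter) -> float:
    """Cosine similarity between two term-frequency vectors."""
    dot = sum(a[w] * b[w] for w in set(a) & set(b))
    norm = (math.sqrt(sum(v * v for v in a.values()))
            * math.sqrt(sum(v * v for v in b.values())))
    return dot / norm if norm else 0.0

def categorize(doc: str, categories: dict) -> str:
    """Assign a document to the category whose seed text it most resembles."""
    scores = {name: cosine(vectorize(doc), vectorize(seed))
              for name, seed in categories.items()}
    return max(scores, key=scores.get)

# Hypothetical category seeds for demonstration only.
seeds = {
    "voice": "alexa voice skill speaker assistant",
    "ar": "augmented reality camera lens immersive",
}
print(categorize("new alexa skill for the smart speaker", seeds))  # voice
```

The idea is the same at any scale: represent unstructured documents numerically, then let a similarity or classification model surface the previously invisible groupings.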


Unstructured and new data sources, combined with Epsilon's proprietary data, have begun to accelerate our processing and analysis capabilities, uncovering consumer truths that further fuel our agency's strategic storytelling and data-driven creative, leading to an evolution of brand planning.

For the past 12 months my data design team has focused on aligning emerging artificial intelligence systems and algorithms with our structured data assets to combine all of the following elements.


Data design is the bridge between planning and bleeding-edge tools like cognitive computing, artificial intelligence and natural language processing. Ad Age highlighted our approach with Amazon, including how we leverage machine learning on amazon.com down to the product SKU level to further inform communication and engagement strategy, and our team being one of the early adopters of the Alexa Skills Kit (ASK).


Here is an example of data design concepts in action.


Follow Tom Edwards @BlackFin360

Trends To Watch in 2017

Technology is now essential to our daily lives. Accessibility and empowerment have transformed how we connect and communicate, leading to new forms of user interaction that will usher in the business models of the future.

2017 will bring new types of conversational experiences to connect with consumers. It will see the continued evolution of artificial intelligence and connected systems, as well as the rapid rise of third-party ecosystems supporting virtual, augmented and mixed reality.

The following trend deck outlines the evolution of marketing in 2017 through the consumer centric filters of connection, cognition and immersion and is now available for download.


  • CONNECTION – Trends that reimagine how we connect, enable and empower consumers.
    • Examples include: Simplified Conversational Experiences, Pervasive Voice-Based Interfaces, Search and Retrieval to 1:1 Prediction, Affective Datasets and eSports


  • COGNITION – Trends where machine based intelligence will disrupt and redefine data assets and how we work.
    • Examples include: Machine Learning as a Service, Centaur Intelligence, Blockchain & AI

  • IMMERSION – Trends that align technology and presence to evoke emotion, entertain and power commerce.
    • Examples include: Democratization of VR, VR Commerce, Social VR, (Re)Mixed Reality

  • ZONE OF CONVERGENCE – Trends that align elements of connection, cognition and immersion that will redefine consumer engagement.
    • Examples include: Cars as the next Mobile Platform, Holographic Computing, Ambient Computing.

How we consume and interact via digital channels is about to be absorbed and redefined. We believe that 2017 will begin the convergence of connection, cognition and immersion toward an ambient computing future built on new data types that will simplify complex tasks and predict need states rather than simply react to them.

Download the 2017 Trend Predictions Today!

Follow Tom Edwards @BlackFin360

Galactic Cannibalism & The Future of Marketing

I have spoken a lot recently about how disruption is the new normal. I recently heard someone compare the last five years to a “supernova” of disruption in terms of the intensity and velocity of change.

With the rise of artificial intelligence, conversational and ambient experiences, connected systems and mixed reality on the horizon, we are moving well beyond a supernova and are now on the verge of galactic cannibalism.

Galactic cannibalism is when one galaxy collides with another and parts of one are absorbed into the other. From a consumer marketing standpoint, how we consume and interact via digital channels is about to be absorbed and redefined through new advancements in connection, cognition and immersion.

The key point to surviving and thriving is to have a comprehensive data strategy as data assets will serve as the fuel of this shift. Regardless of which galaxies collide a thorough understanding of data, content, experiences and outcomes is a marketing foundation for the future.

Follow Tom Edwards @BlackFin360

Voice Based UI Best Practices

Over the past year I have focused research efforts on the shift towards conversational experiences and what consumers expect. The research has been covered by Adweek, and it's fascinating how open consumers are to engaging with and adopting these experiences as long as they are easy to use and convenient.

One flavor of conversational experiences is tied to voice based user experiences. I recently visited Amazon HQ in Seattle and wrote about my experience with the newly formed Amazon Alexa partner team and the rise of voice based user experiences.

Since this article was published, I have seen client interest in and demand for voice based concepts and skill creation rise as our brand partners see the potential of voice based systems.

Here is a slide from a recent client presentation. Almost every meeting over the past few months has included discussions around voice based UI.

I strongly believe that over the next few years we will see a convergence: elements that enable connection, such as social messaging and voice based conversational user experiences, will combine with cognitive computing (AI) and immersive experiences such as holographic computing. These interconnected systems will redefine how we approach connecting with consumers.

Voice based experiences will play a key role during this time as our interactions with connected systems and the rise of micro services as a primary mechanism to navigate a hyper connected world will become the new normal.

We will see services such as Alexa Voice Services quickly proliferate across 3rd party devices, from in-home IoT systems to connected vehicles, and "skills" will become a key component of how we navigate beyond screens. Estimates already show over 28 billion connected devices by 2019.

Developing voice based experiences differs greatly from visually driven experiences. Visual experiences provide immediate context and cues to the end user that can guide the user and enhance the experience.

Here are five emerging voice UI design patterns the Amazon team and I discussed, along with best practices and points to consider when designing voice based skills.

1) Infinitely Wide Top Level UI

With a mobile user experience, users have the benefit of visual cues that can guide their actions within a given experience, be it a hamburger menu or on-screen prompts. With voice based UI, the top level of the UI is infinitely wide. Here are a few best practices for designing for an infinitely wide top level.

Don’t assume users know what to do – The first time a voice skill is initiated, provide additional detail and tell users what options they have for interacting with your experience.

Expect the Unexpected – Unlike visual interfaces, there is no way to limit what users can say in a speech interaction. It's important to plan for reasonable things users might say that are not supported and handle them intelligently.

2) Definitive Choices – The key to successful voice UI design is to make the next consumer action clear. Consumers will not always say what they want, so it is incredibly important to map intent beyond the normal function of a skill. For example, a consumer may end a session by uttering "done," "quit" and so on, and the skill needs to provide a clear path for ending the session. Here are additional points to consider.

Make it clear that the user needs to respond – Ask the user a question vs. simply making a statement.

Clearly present the options – Prompts are very important, especially if the question set is an either/or vs. yes/no.

Keep it Brief – Speech is linear and time based. Users cannot skim spoken content like visual content. Quick decisions are key, so voice based prompts should be short, clear and concise.

Avoid too many choices – Make sure choices are clearly stated, do not present more than three choices at a time, and avoid repetitive words.

Use Confirmation Selectively – Avoid dialogs that create too many confirmations, but confirm actions of high consequence.
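The intent-mapping guidance above can be sketched in a few lines. This is an illustrative sketch only, not part of the Alexa Skills Kit: the function and intent names (`resolve_intent`, `EndSessionIntent`) are hypothetical, and a real skill would declare these utterances in its interaction model rather than match strings by hand.

```python
# Hypothetical sketch: mapping varied "I'm done" utterances to one
# canonical end-of-session intent, as recommended above. Names are
# illustrative and not Alexa APIs.

END_SESSION_UTTERANCES = {"done", "quit", "stop", "exit", "i'm done", "goodbye"}

def resolve_intent(utterance):
    """Map a raw utterance to a canonical intent name."""
    normalized = utterance.strip().lower().rstrip(".!")
    if normalized in END_SESSION_UTTERANCES:
        return "EndSessionIntent"
    return "UnhandledIntent"  # plan for the unexpected

print(resolve_intent("Quit!"))      # EndSessionIntent
print(resolve_intent("play jazz"))  # UnhandledIntent
```

The point of the fallback intent is the "expect the unexpected" rule: every unrecognized utterance still gets a deliberate, graceful response.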

3) Automatic Learning

One of the areas I am most excited about over the next few years is the intersection of artificial intelligence and the ability to apply machine learning and other higher level algorithms to create more personalized experiences. For Voice based UI it is important to understand how sessions can persist over time.

Obtain one piece of information at a time – Users may not always give all of the information required in a single step. Ask for missing information step by step and focus on a progressive profiling strategy vs. lead capture.

Develop for Time Lapse – It is possible to create skills that allow for sessions to persist with end users. This can be hours or days. This can allow more data to be collected across sessions.

Personalize Over Time – As sessions persist and users interact with skills it is possible to further personalize the experience over time based on previous interactions.
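The three points above, obtaining one piece of information at a time and merging it into a profile that persists across sessions, can be sketched as follows. This is a minimal illustration under stated assumptions: the in-memory dict stands in for whatever persistent store a real skill would use, and all field names are invented.

```python
# Illustrative sketch of progressive profiling across persisted sessions:
# each session contributes an answer or two, merged into a stored profile.
# `profiles` is a stand-in for a real persistence layer.

profiles = {}  # user_id -> collected attributes

def update_profile(user_id, answers):
    """Merge newly captured answers into the user's stored profile."""
    profile = profiles.setdefault(user_id, {})
    profile.update(answers)
    return profile

def next_question(profile, required=("name", "zip_code", "favorite_team")):
    """Ask for one missing piece of information at a time."""
    for field in required:
        if field not in profile:
            return "What is your " + field.replace("_", " ") + "?"
    return None  # profile complete

# Session 1: the user gives only a name; later sessions fill in the rest.
update_profile("user-1", {"name": "Alex"})
print(next_question(profiles["user-1"]))  # What is your zip code?
```

Because the profile survives the session, each interaction can be shorter and more personalized than the last.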

4) Proactive Explanation

With traditional visual design, a user can open a web page or a mobile app and the information design shows them what to do. With voice there is no page, so clearly articulating definitive choices, and providing proactive explanations such as tutorials or help, is critically important to reduce user frustration.

Offer help for Complex Skills – If a skill performs more than three functions, do not overload a single prompt. Present the most important information first, along with the option of a help session.

Make sure users know they are in the right place – In speech only interactions, users do not have the benefit of visuals to orient themselves. Using “landmarks” tells users that Alexa heard them correctly, orients them in the interaction and helps to instill trust.

Use Re-Prompting to Provide Guidance – Offer a re-prompt if an error is triggered. This should include guidance on next steps.

Offer a way out if the user gets stuck – Add instructions to the help session, such as "You can also stop if you're done."

Don’t blame the user – Errors will happen. Do not place blame on the user when errors happen.

5) Natural Dialog

Research shows that people are "voice activated": we respond to voice technologies much as we respond to actual people. This makes the crafting of voice based narratives incredibly important, as the dialog needs to be natural, consumable and written for the ear, not the eye. Here are a few key points to consider for enhancing natural dialog within a skill.

Present information in consumable pieces – Humans retain only a small amount of the information they hear, so present only what is absolutely required and keep the interaction as short as possible.

Break longer lists into chunks of three to five items, and ask the user if they want to continue after each chunk.
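The chunking guidance above is easy to sketch. This is an illustrative helper, not Alexa code: the function name, chunk size and sample list are all invented for the example.

```python
# Sketch of the "three to five items per chunk" guidance: break a long
# spoken list into chunks, each followed by a continuation prompt.

def chunk_for_speech(items, chunk_size=3):
    """Yield (spoken_text, continuation_prompt) pairs for a long list."""
    for start in range(0, len(items), chunk_size):
        chunk = items[start:start + chunk_size]
        spoken = ", ".join(chunk)
        more_left = start + chunk_size < len(items)
        prompt = "Would you like to hear more?" if more_left else ""
        yield spoken, prompt

toppings = ["pepperoni", "mushroom", "onion", "sausage",
            "bacon", "olives", "peppers"]
for spoken, prompt in chunk_for_speech(toppings):
    print(spoken, prompt)
```

Only the final chunk omits the continuation prompt, keeping every turn short and giving the user control over whether to keep listening.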

Write for the Ear, not the Eye – The prompts written for voice-forward experiences will be heard, not read, so it’s important to write them for spoken conversation. Pay attention to punctuation.

Avoid Technical & Legal Jargon – Be honest with the user, but don’t use technical jargon that the user won’t understand or that does not sound natural. Add legal disclaimers to the Alexa app for users to read and process.

Rely on the text, not stress and intonation – Use words to convey information effectively. It is not possible to control the stress and intonation of the speech; you can add breaks but cannot change elements such as pitch, range, rate, duration and volume.

Clarify Specialized Abbreviations and Symbols – If an abbreviation, phone number or chemical compound is somewhat specialized, test the text-to-speech conversion to see if additional steps need to be taken.
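The last two points, adding breaks and steering pronunciation of specialized strings, are typically handled with SSML markup. The sketch below builds an SSML response in plain Python; `<break>` and `<say-as interpret-as="telephone">` are standard SSML tags supported by Alexa at the time, but the helper function and copy are invented, and any real output should still be tested against the text-to-speech engine.

```python
# Hedged sketch: building an SSML response that reads out a phone number
# digit by digit and inserts a pause, since stress and intonation cannot
# be controlled directly.

def phone_number_ssml(number):
    """Return an SSML string that speaks a phone number clearly."""
    return (
        "<speak>"
        "You can reach us at "
        '<say-as interpret-as="telephone">' + number + "</say-as>."
        '<break time="500ms"/>'
        "Call anytime."
        "</speak>"
    )

print(phone_number_ssml("2025550136"))
```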

One final takeaway regarding the Alexa voice based system is its proximity to transactions and list creation via Amazon's core services. This, combined with six years of development behind Alexa Voice Services and the rising partner ecosystem, all signals the convergence of connection, cognition and immersion.

Follow Tom Edwards @BlackFin360

Amazon Alexa & Voice User Experiences

Since it first arrived at my home nearly a year ago, I have been hooked on the Amazon Echo and the potential of voice based user experiences. This week I spent time at Amazon HQ in Seattle meeting with the Alexa partner team, discussing everything from voice UX best practices to skills development for Alexa and more.

To recap, the Echo and its cloud-supported voice engine Alexa have been in development for the last six years. Since the initial launch, the devices that comprise the Echo ecosystem are regularly sold out, and based on nearly 40,000 stellar customer reviews (4.5 stars), the experience is resonating with its users.

The core of the experience is a combination of automated speech recognition, natural language processing and a cloud based AI that comprise a voice based user experience. Voice UX is another example of a conversational experience and will become pervasive over the next few years.

As with most artificial intelligence entities, learning new skills is how personalized and contextual experiences will be created. It is possible to "teach" Alexa new conversational elements and interactions by developing skills.

An analogy is when Neo in The Matrix "learns" kung fu through a knowledge upload. Alexa may not be able to learn kung fu, at least not yet, but it is possible to build highly engaging voice based experiences.

Developing skills for Alexa is one of the quickest ways for brands to connect with the rapidly growing audience that calls upon Alexa to empower their daily lives. Brands such as Domino's and Capital One have already launched skills to capitalize on being first to own certain invocation phrases. With the Domino's skill, a user can order a pizza and track the order through Alexa.

Skills are comprised of a Skill Interface and a Skill Service. The Skill Interface is how the voice user experience is configured. This includes invocation and utterance phrases from the user, as well as the mapping of intent schemas scored and resolved by the Skill Service. This is how Alexa is trained to take a user's spoken words, connect them with the user's intent, and resolve that intent into action.
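The two halves can be sketched as follows. The schema shape loosely mirrors the Alexa custom-skill intent schema of the time (intents with optional slots, plus sample utterances), but the intent names, slot types and responses here are hypothetical, and the "skill service" is reduced to a simple dispatch function rather than a real Lambda handler.

```python
# Illustrative sketch of a skill's two halves: the Skill Interface
# (intent schema + sample utterances) and the Skill Service (resolving
# an intent into an action). All names and copy are invented.

intent_schema = {
    "intents": [
        {"intent": "OrderPizzaIntent",
         "slots": [{"name": "Size", "type": "PIZZA_SIZE"}]},
        {"intent": "TrackOrderIntent"},
    ]
}

sample_utterances = {
    "OrderPizzaIntent": ["order a {Size} pizza", "get me a {Size} pizza"],
    "TrackOrderIntent": ["where is my order", "track my order"],
}

def skill_service(intent_name):
    """Resolve an intent into a spoken response (the Skill Service half)."""
    handlers = {
        "OrderPizzaIntent": "Okay, starting your pizza order.",
        "TrackOrderIntent": "Your order is in the oven.",
    }
    return handlers.get(intent_name, "Sorry, I didn't catch that.")

print(skill_service("TrackOrderIntent"))  # Your order is in the oven.
```

The split matters for brands: the interface defines what consumers can say, while the service is where business logic, data capture and commerce live.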

One of the benefits of Alexa is that experiences can persist beyond a single session. Even though the experiences may seem ephemeral by nature, skills can be created that persist across sessions, whether for hours or days.

The other benefit is that all invocations and interactions are mapped to cards in the Alexa companion app. This is one way that brands can connect a skill interaction with mobile and digital campaigns.

Another benefit for brands is the ability to deep link to skills within the Alexa companion app, useful for those looking to connect omnichannel communication and messaging to drive discoverability of a skill.

One of the key points for brands to consider is the role that being “first” can play when it comes to user invocation terms. Brands that align with non-trademarked terms such as “laundry” will be first in the order of how skills are discovered. This is key as the Alexa engine expands beyond the Echo with the Alexa Voice Service.

Looking to the near future, there will be 45 million connected homes by 2017, and connected car penetration will exceed 60 million cars by 2020. The role that Alexa plays in the coming years will go well beyond the Echo, Dot, Tap and Fire Stick, extending into other form factors through the portable Alexa Voice Service.

An example is the connected car partnership between Ford and Amazon to further extend Alexa. This is where the platform will create scale across the ever-growing IoT ecosystem.

Future posts will cover emerging trends tied to voice based user experiences such as the infinitely wide top level UI, definitive choices, automatic learning, proactive explanation and natural dialog. For additional questions or assistance with Alexa Skills, please follow Tom Edwards @BlackFin360

The Medium Is the Message

This week Adweek published, in both the print and online editions, our quantitative research infographic about consumer behavioral shifts tied to social messaging and the types of experiences consumers are interested in engaging with.

Look for this week’s issue of Adweek. Our research is on page 13.

With apps like Facebook Messenger, WhatsApp and Snapchat vying with conventional SMS to be the preferred texting method, the line between social media and texting is more blurred than ever. And brands have a real chance to capitalize on this, according to a newly released study by Dallas-based marketing group Epsilon.

“We are on the verge of a transformational moment, as consumer behavior is dictating a shift towards intimacy of sharing content and experiences versus public sharing,” said Epsilon chief digital officer of agency business Tom Edwards. “Messaging apps now boast more active users than social networks, and this shift from social media to social messaging will redefine how we, as marketers, will approach connecting with consumers.”

Follow Tom Edwards @BlackFin360

Facebook F8 2016 Trend Recap

I recently attended Facebook’s F8 developer conference in San Francisco and the event did not disappoint. Mark and the Facebook team outlined their approach to a ten year roadmap, launched the highly anticipated Messenger chat bot beta and showcased their first concepts of a social virtual reality experience.

The presentation below covers:

•  The 10 year roadmap analysis

•  The Rise of Chat bots

•  Immersive Experiences & Social VR

The 10 year Roadmap

This was the 10 year roadmap presented at F8. It follows the lifecycle continuum approach outlined in the previous slide.

Facebook proper is the most mature and has a thriving 3rd party ecosystem as well as a sustainable monetization model.

Messenger has been identified as the next ecosystem, with powerful tools released at F8 2016 to drive conversational commerce and a new approach to replacing apps.

VR, connectivity and AI represent the near future for Facebook, and Social VR will be a key area to watch. Developing strategies that capitalize on creating value today while experimenting for the future is key.

For analysis on Facebook’s 10 year roadmap including Facebook’s approach to product lifecycle, Facebook proper, the Live video API, approach to connectivity, artificial intelligence and Facebook’s investment in hardware and open platforms view slides 4-12 in the embedded slideshare.

The Rise of Chatbots

With 900M users and over 1 billion messages sent per month, Facebook feels Messenger has progressed through its product lifecycle continuum and hit the inflection point of scale needed to build out an ecosystem that solidifies and sustains Messenger as the go-to mobile application.

The key is that Messenger will support one bot to many pages. This makes it easy to seamlessly connect brands or services in a portfolio to create compelling and unique experiences that are 1:1.

Since Facebook does not own the mobile hardware or the operating system, they are positioning Messenger threads as a replacement for native apps.

For in-depth analysis of chat bots including an overview, conversational commerce, the send & receive API, wit.ai, discovery within Messenger, promotion and conversational advertising  view pages 14-22 of the embedded slideshare.

In addition to this POV, our Epsilon agency team wrote a comprehensive eBook that launched when Facebook announced the Messenger beta. The eBook covers the shift from social media to messaging and the role data, chat bots and conversational commerce will play for brands.

Virtual & Augmented Reality

Facebook states that virtual reality is the next evolution of computing and is heavily invested in the hardware and experiences that align technology with presence.

During F8, Facebook outlined a path forward for active VR experiences, demonstrated social VR concepts publicly for the first time and identified augmented reality as a viable disruptor, a first given that the conversation to date has been all about VR experiences.

Virtual reality experiences are coming, and the key will be empowering consumers to create their own immersive experiences. Facebook's long-term goal is to create completely virtual experiences that recreate the physical world. For now, wave 1 will be avatar based.

For in-depth analysis of virtual reality including an overview of the role of the Gear VR in the ecosystem, Oculus Touch, the first public demo of Facebook’s Social VR concepts and the bets of the future review slides 23-29 of the embedded slideshare.

For more insights and analysis follow Tom Edwards @BlackFin360

The Social Shift Towards Messaging eBook

Today at F8, Facebook made the formal announcement of the beta launch of 3rd party chat bot support for Facebook Messenger. I have written a few articles on this topic and have consolidated the thinking into an eBook.

Social media—and now social messaging—is a path to understanding and being in a relationship with your customers. Social messaging is poised to become the most direct direct-marketing channel, creating immediate 1:1 conversations with customers.

As consumer behavior shifts toward more intimate forms of communication and away from public sharing, we're seeing social messaging apps become more popular than networking apps. Social messaging apps are the new lifestyle platforms, where consumers can do everything from booking a vacation or ordering food to checking traffic, giving rise to a new form of commerce.

 

This white paper provides a deep-dive into:

1) Shifting consumer behaviors towards social messaging

2) The potential impact of these changes, driven by chatbots and conversational commerce

3) Proposed best practices and future considerations.

Download the eBook today!

Follow Tom Edwards @Blackfin360

In The News: Chatbots & E-Commerce

I was recently asked by ClickZ for commentary about what role chatbots can play for e-commerce.

Are Chatbots the future or fad?

I am a believer that chatbots are a key element in the creation of conversational user experiences and will become core to the messaging experience. Chatbots will introduce new interaction models, with new rules of engagement and capabilities that flow seamlessly based on user interactions vs. installing and swapping between multiple apps.

A Messenger chatbot ecosystem could rival and ultimately replace traditional app marketplaces, and conversational chatbots, whether pure artificial intelligence or bots augmented by humans, will become the new standard for content delivery, experiences and transactions.

We view messaging apps as the new brand portal, conversational user experiences as the new interface and chatbots as the new apps. What makes this approach unique is that it is permission based, contextually relevant, immediate and native to mobile.

How can brands use chatbots to enhance their ecommerce?

Conversational commerce will be a key value proposition from messaging platforms. Our Epsilon research shows that messaging significantly impacts purchasing behaviors. Notably, consumers take photos, screenshots, and conduct video chats in real time to seek out assistance during their shopping process.

Brands can build bots with topical response decision trees that align with creating seamless paths to products and services. An example is how Sephora recently partnered with Kik to create a bot driven experience that led a customer through a personalized journey that ends with conversion directly within the conversation.
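A topical response decision tree like the one described above can be sketched as a small graph: each node asks a question, and the shopper's answers route toward a recommendation. This is a hypothetical illustration of the pattern, not the Sephora/Kik implementation; the node names, questions and products are all invented.

```python
# Hypothetical sketch of a topical response decision tree: nodes ask
# questions, answers route the shopper toward a product recommendation.

DECISION_TREE = {
    "start": {
        "question": "Are you shopping for skincare or makeup?",
        "answers": {"skincare": "skin_type", "makeup": "makeup_look"},
    },
    "skin_type": {
        "question": "Is your skin dry or oily?",
        "answers": {"dry": "rec_moisturizer", "oily": "rec_cleanser"},
    },
    "makeup_look": {
        "question": "Are you going for a natural or bold look?",
        "answers": {"natural": "rec_tint", "bold": "rec_lipstick"},
    },
    # Leaf nodes end the flow with a conversion prompt.
    "rec_moisturizer": {"question": "Try our hydrating moisturizer. Add to cart?"},
    "rec_cleanser": {"question": "Try our clarifying cleanser. Add to cart?"},
    "rec_tint": {"question": "Try our tinted balm. Add to cart?"},
    "rec_lipstick": {"question": "Try our matte lipstick. Add to cart?"},
}

def ask(node_id):
    """Return the question the bot speaks at this node."""
    return DECISION_TREE[node_id]["question"]

def advance(node_id, answer):
    """Route to the next node based on the shopper's answer."""
    return DECISION_TREE[node_id]["answers"][answer.strip().lower()]

node = advance("start", "skincare")
node = advance(node, "dry")
print(ask(node))  # Try our hydrating moisturizer. Add to cart?
```

The key design point is that conversion happens at a leaf of the conversation itself, rather than by kicking the shopper out to a separate app or site.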

With Facebook’s upcoming launch of 3rd party chatbot support, they are empowering chatbot developers with tools to create structured messages that include images, descriptions, call-to-action and URL’s to connect conversation to commerce.
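A structured message of the kind described above takes roughly the shape below, a Send API "generic template" with an image, description and call-to-action button. This is a sketch of the payload shape only: the recipient ID, product details and URLs are placeholders, and a real bot would POST this JSON to the Send API with a page access token.

```python
# Sketch of a Messenger structured-message payload (generic template with
# image, subtitle and a call-to-action URL button). All values are
# placeholders; a real call would POST this to Facebook's Send API.
import json

def product_card(recipient_id, title, image_url, product_url):
    """Build a one-card generic-template message connecting chat to commerce."""
    return {
        "recipient": {"id": recipient_id},
        "message": {
            "attachment": {
                "type": "template",
                "payload": {
                    "template_type": "generic",
                    "elements": [{
                        "title": title,
                        "image_url": image_url,
                        "subtitle": "Tap to view this product.",
                        "buttons": [
                            {"type": "web_url",
                             "url": product_url,
                             "title": "Shop Now"},
                        ],
                    }],
                },
            }
        },
    }

payload = product_card("USER_ID", "Lipstick",
                       "https://example.com/lip.jpg",
                       "https://example.com/shop")
print(json.dumps(payload, indent=2))
```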

The key for brands to understand is that, for now, chatbots are domain specific rather than general intelligence. This means there is an opportunity to capture data upfront to establish a frictionless and personalized experience for consumers.

Follow Tom Edwards @BlackFin360

Thriving Through Digital Disruption

I had the pleasure of speaking during today's Brand Activation Summit in NYC. I joined an esteemed panel comprised of a CEO, a CMO and me (a CDO) to discuss thriving in the age of digital disruption.

My topics ranged from the role of the Chief Digital Officer to vertical-specific discussions tied to the future of digital. Over the course of an hour I touched on many topics that I have recently written or spoken publicly about.

It was a great discussion and a highly engaged audience.

Follow Tom Edwards @BlackFin360

An Emoji Basketball Could Be The Future of Marketing

On March 17th Facebook rolled out a simple update to Messenger just in time for March Madness.

By simply using the basketball emoji in Messenger a user can play a simple swipe and shoot mini game directly within the Messenger app experience.

This very simple integration could well preview how brand marketers can capitalize on activating within the Messenger ecosystem. This, along with the potential rise of 3rd party chat bots, could fundamentally change how we interact with our mobile devices, social media and apps moving forward.

Facebook Messenger has over 800 million users, and in January of this year social messaging apps such as Facebook Messenger passed social networks in active users for the first time.

I have written a lot about Facebook’s plans to convert Messenger into a commerce hub and a 3rd party development platform. Next month Facebook is rumored to release their Chat Bot SDK at F8 and that could quickly accelerate a massive shift in behavior.

The basketball emoji example shows how a brand could activate in a contextual way through a conversational UI, bringing emoji, stickers and other experiences directly into the Messenger experience.

As of today, 43.7 million players worldwide have played the basketball Messenger mini-game. It hit the 300 million sessions mark just a week after launch, and the game has taken place in 61 million different conversations on Messenger.

Facebook would join Telegram as one of only two messaging providers that fully support open 3rd party apps. You can see examples of bot integrations in action, as Uber and Lyft are already integrated with Messenger.

This move by Facebook would provide scale and a massive audience, and I am seeing additional enhancements being made prior to F8, such as the testing of in-line bots before the release of an SDK. This is similar to Telegram and Kik and allows users to connect directly with existing bots.

The example below shows in-line bots for Facebook Chess and Daily Cute.

A Messenger Chat Bot ecosystem could rival and ultimately replace app marketplaces. Conversational chat bots + AI through messaging could become the new standard for content delivery, experiences and transactions.

Building on the models we have seen in Asia with WeChat and Line, brand marketers will need to rethink the role their brands play to enable conversations, entertainment and convenience through bots vs. how they engage today through social and other channels.

Going back to the basketball example, this means that brands could theoretically own the activation of Unicode emoji as well as custom stickers and experiences. There is also a stickiness to the experience, as high scores and other messages are shared between both parties.

Bots can also reduce the need for separate mobile apps across phone operating systems, offering lower operational costs. Chat will quickly become the mobile portal; just as Google dominates desktop search, Facebook is looking to dominate messaging on mobile.

We cannot ignore the shift of consumers toward more intimate means of sharing, or the potential of a comprehensive Messenger-based ecosystem that can deliver information, rich media, location services, e-commerce and traditional commerce.

I will be on the ground at F8 and will bring live coverage of all of the details if and when Facebook formally announces their 3rd Party Chat Bot SDK.

Follow Tom Edwards @BlackFin360

The Power of Conversational User Experiences

Over the years I have built and defined go-to-market strategies for a number of native applications. I enjoy a clean user experience and am always on the lookout for new and compelling ways to connect with consumers.

With that said, I am incredibly impressed by the launch of Quartz's native iOS app. Instead of an endless stream of news headlines, their approach simplifies the news experience into an emoji-driven, text/messaging-like conversation that gives users the illusion that they are in control of the content experience.

There are three aspects of the experience that I find unique. Below are points to consider that could apply to brand marketers who create heavily content-centric experiences.

Conversational Flow – The simplicity and familiarity of the experience makes it very appealing. The user experience (UX) is framed just like a traditional text/messaging conversation.

This immediately provides a feeling of intimacy vs. being presented with a sea of information to wade through. The use of emoji and animated gifs also gives it more of a conversational messenger feel vs. a traditional news/content experience.

User-Controlled Experience – The other aspect of the UX that I really like is the ability to self select the direction of the experience. I have the option to click the emoji driven option that opens the article within the native app or continue down the path of the next article.

This semblance of control is important: psychologically, an environment that feels safe and gives me the illusion of control is key to gaining my attention and deliberate focus on the topics at hand.

The integration into notifications as a driver for ongoing engagement is key as well. Knowing the experience is more conversational vs. disruptive can potentially lead to longer term engagement.

Conversational Advertising – From a marketing and advertising perspective the format is very interesting. Each story is tied to a user action and a preference signal is given. Over time it could be possible to build a robust progressive profile based on interactions that can lead to a truly personalized experience.

Out of the gate I do not see the Quartz app taking this approach, but that would be a natural next step to continue to refine the offering and potentially have it powered by an AI based system that can quickly parse the data into personalized streams and map “conversational advertising” into the experience.

What I did like about the ad serving within the experience is that it was not disruptive. Once I had completed reviewing the curated selection of content I was then rewarded with an animated gif that again reinforces the conversational aspect and then given a simple advertising message about the app being brought to me by the new MINI Clubman.

Even though this is a form of native advertising, I am going to call it conversational advertising, as we are in the midst of a massive shift from social media to social messaging, where consumers are looking for intimate, conversational experiences focused on empowering, enabling and enhancing their mobile/digital/social experiences.

Kudos to the Quartz team for delivering a highly conversational approach to information overload and understanding the importance of empowering the consumer.

Follow Tom Edwards @BlackFin360