Facebook F8 Full Recap & Analysis

I look forward to Facebook’s F8 developer conference each year. It’s a great opportunity to see how Facebook is prioritizing and adjusting their 10-year roadmap based on shifting consumer behavior and new advancements in technology.

What was fascinating about this year’s conference is the rate at which Facebook is accelerating the convergence of technologies that connect us, immerse us in new virtual worlds and advance innovation well beyond what we would expect from a company that identifies itself as social-first.

Facebook wants to redefine how we think about reality, and about a not-too-distant future in which all reality is augmented and virtual. The following provides analysis across the consumer-centric filters of connection, cognition and immersion.

  • Connection – Trends that reimagine how we connect, enable and empower consumers
  • Cognition – Trends where machine based intelligence will disrupt and redefine data assets and how we work
  • Immersion – Trends that align technology and presence to evoke emotion, entertain and power commerce

Here are a few examples of the 15 territories analyzed, starting with:

The Camera as the First Augmented Reality Platform – Facebook understands that the key to truly creating scale is empowering consumers, developers and other 3rd parties to create experiences on its behalf. Consumer empowerment is powerful, and it will accelerate adoption and ultimately shift consumer behavior toward a new normal.



The democratization of augmented reality (AR), powered by advancing artificial intelligence (AI), has the potential to redefine advertisers’ approach to content marketing, making it less about content and more about enabling experiences through compelling, contextually relevant effects.

Frames & AR Studio – Two sets of tools comprise the new Camera Effects Platform. The Frame Studio allows for the quick creation and deployment of frames that can enhance an image, video or even a Facebook Live stream. It lets artists, creators and brands create frames that can be distributed using Facebook’s targeting capabilities.

The AR Studio is where it’s possible to create lightweight AR effects that can be developed and enhanced with elements such as real-time data to build highly contextual AR experiences. This is where brand marketers have an opportunity to align data + experiences.
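To make that concrete, here is a rough sketch of how a data-driven effect could be scripted. AR Studio effects are scripted in JavaScript; the TypeScript-flavored sketch below is illustrative only, and the module, object and endpoint names (Scene, Networking, a 'scoreText' text object, the live-score URL) are assumptions rather than confirmed API.

```typescript
// Hypothetical sketch: bind real-time data into an AR effect's text overlay.
// Module and property names are assumptions for illustration, not confirmed
// AR Studio API; the endpoint is a placeholder.
declare function require(module: string): any;

const Scene = require('Scene');
const Networking = require('Networking');

// Assumes the effect contains a 2D text object named 'scoreText'.
const scoreText = Scene.root.find('scoreText');

// Pull live data (e.g., a game score) so the effect stays contextually
// relevant while the camera is open.
Networking.fetch('https://example.com/api/live-score') // placeholder endpoint
  .then((response: { json(): Promise<any> }) => response.json())
  .then((data: any) => {
    scoreText.text = `${data.homeTeam} ${data.homeScore} - ${data.awayScore} ${data.awayTeam}`;
  });
```

The point for marketers is the shape of the pattern: a lightweight effect, a live data source and a binding between the two.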

Gaming & eSports

The convergence of gaming and video has been a massive trend over the past 24 months. 2B people play games each month, and the rise of game streams means 665M people now watch others play.

On Facebook people watch, play & create. Facebook’s gaming video product supports eSports (14-31% of live gaming consumption), developers, gaming entertainers and social connection for consumers of game stream content. 

Gaming content is digitally native, with real-time interactivity baked in. With gaming video, the audience is more than a spectator; they participate in the experience through comments and by getting involved in the gameplay.

Messenger 2.0 – 2016 was considered the year of the bot, fueled primarily by Facebook’s Messenger platform beta, which accelerated the development of a bot ecosystem to further enhance the Messenger experience.

In 2017, Facebook is positioning Messenger as Messenger 2.0, with a sharp focus on integrating other services via chat extensions, which give 3rd party bots the ability to seamlessly connect services such as Spotify or Apple Music into a conversation.
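For context on what a 3rd party bot behind these experiences looks like, here is a minimal TypeScript/Express sketch of a Messenger webhook replying through the Send API. The endpoint shapes follow the 2017-era Messenger Platform (Graph API v2.6); the page access token, webhook path and reply copy are placeholders, and the chat extension webview configuration and the webhook verification handshake are omitted.

```typescript
// Minimal Messenger bot webhook sketch: receive a message event, echo a reply
// via the Send API. Requires Node 18+ (global fetch) and @types/express.
import express from 'express';

const app = express();
app.use(express.json());

const PAGE_ACCESS_TOKEN = process.env.PAGE_ACCESS_TOKEN ?? ''; // placeholder

// Messenger delivers user messages as POSTs to the webhook you register.
app.post('/webhook', async (req, res) => {
  for (const entry of req.body.entry ?? []) {
    for (const event of entry.messaging ?? []) {
      if (event.message?.text) {
        // Reply through the Send API with a simple text message.
        await fetch(
          `https://graph.facebook.com/v2.6/me/messages?access_token=${PAGE_ACCESS_TOKEN}`,
          {
            method: 'POST',
            headers: { 'Content-Type': 'application/json' },
            body: JSON.stringify({
              recipient: { id: event.sender.id },
              message: { text: `You said: ${event.message.text}` },
            }),
          }
        );
      }
    }
  }
  res.sendStatus(200);
});

app.listen(3000);
```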

Facebook is also keen on driving discovery among the 100,000 bots now on the platform via the new discover tab.

Data Design & Artificial Intelligence 

Facebook is focused on leveraging multiple facets of Artificial Intelligence to power their products and accelerate 3rd party ecosystems.

Computer vision, natural language processing, and algorithms drive content discovery and their newly launched AR experiences. AI is now a foundational element to Facebook’s go-to-market strategy.

Facebook’s ultimate goal is to develop intelligent systems that go beyond computer vision and truly understand the world. This will then converge with their vision of an AR driven future to create a unified experience.

The Rise of Proxies – In the very near future, we as consumers will have intelligent systems serving the role of a proxy. Facebook is betting on M to first serve as a virtual assistant and eventually become a predictive service that is the foundation for their virtual computing future.

M will integrate into multiple facets of a user’s life, from sharing location to making recommendations. In the near future, M could become the connection between a recommendation and an AR object-recognition action.

Virtual Reality & Facebook Spaces – Facebook officially launched Spaces for Oculus. This was first teased at F8 last year, and the experience has advanced well beyond the grainy avatars of a year ago.

Facebook took research and learnings from Oculus Rooms on Samsung Gear VR and refined an experience that lets your virtual avatar interact with Facebook content and friends in a virtual environment.

From virtual selfies to watching 360 video, it is clear that Facebook is focused on creating a new form of social interaction within a virtual environment.

The Future – Facebook took the first major step toward their 10-year goal of fully immersive augmented reality by launching the camera as their first augmented reality platform.

On day 2 of the conference, they outlined in detail their vision of transparent glasses (deemed more socially appropriate) or some equivalent, paired with a general artificial intelligence system, to enhance our daily lives.

This includes improving memory, cognition, recognition and redefining how we interact with the physical world and collaborate with one another.

Here is the full recap covering all 15 territories analyzed, plus implications for brand marketers to consider based on the trends identified.

Follow Tom Edwards @BlackFin360

In The News: AdExchanger & F8

I had an opportunity to sit down with AdExchanger during Facebook’s F8 developer conference.

We discussed how Facebook’s new focus on augmented reality through camera effects can impact the future of marketing, from the creation of effect-based advertising to the intersection of artificial intelligence and the role of data.


Read the full article here.

Follow Tom Edwards @BlackFin360

F8 2017 Live Recap

Coming to you live from San Jose and Facebook’s F8 conference. Here is a recap of day 1. From the camera as the first AR platform to AI & Social VR.


Follow Tom Edwards @BlackFin360

Advertising Age Marketing Technology Trailblazer

Today Advertising Age announced their 2017 list of top 25 Marketing Technology Trailblazers and I am honored to be included.


Photo by Bradley Taylor, Caprock Studio 

A big thank you to the Epsilon corporate communications team, DGC and Advertising Age judges. I am truly humbled by the inclusion with such a great list of industry innovators.

I am incredibly grateful to my data design strategy and innovation teams. From research, planning, data design, digital strategy, digital experience delivery, social and innovation, a huge thank you for all that you do.

I also want to thank Richard McDonald and the Epsilon agency leadership team for your continued support. Richard, it was your vision that sold me on joining Epsilon, and it’s one of the best career decisions I have made.

Finally, a very special thank you to my amazing wife Cherlyn for supporting all the crazy hours and travel for the past 17 years.

Follow Tom Edwards @BlackFin360


CX Future = Voice + Visual

I have written articles and commented quite a bit about Amazon Alexa and voice based conversational experiences in the media over the past 12 months.

To date there are over 10 million Alexa-powered devices in consumer homes, and that number is about to increase significantly as the Alexa Voice Service integrates into everything from cars, such as the Ford SYNC 3 system, to mobile handsets.

Here is an example of Alexa integrated into the Ford Sync 3 system rolling out in various Ford models this fall. 

Regarding Alexa skills: skills are to Alexa what apps are to mobile. When I first met with the Amazon Alexa partner team a year ago, there were barely 1,000 skills published. Today there are over 10,000, and that number continues to climb.

Beyond skills, the shift toward voice-based experiences has already begun. In 2014, voice search traffic was negligible. Today it exceeds 10% of all search traffic, and virtual assistants handle more than 50B voice searches per month.

That number is going to keep accelerating: by 2020, more than 200 billion searches per month are projected to be done by voice. Voice will quickly become a key horizontal channel and central to a converged user experience.


What most don’t realize, though, is that while most experiences today are zero-UI, voice-only experiences, the next evolution of voice-based systems will pair voice with visual experiences.

This will ultimately be driven by new hardware that integrates screens, but initially it will be driven by responsive web experiences that are powered by Alexa and remain hands-free.

Soon, virtual assistants such as the Sony XPERIA Agent shown at MWC 2017 will have integrated screens to enhance voice + visual.

Voice-based skills will be able to showcase information visually by aligning voice intents with visual cues, creating a seamless, voice-controlled experience.

From dynamic content to video, an Alexa skill can answer a query and visually showcase complex solutions or highly visual elements, such as what a finished recipe should actually look like rather than leaving the user to picture it in their mind.
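To make the pattern concrete, here is a minimal TypeScript sketch under stated assumptions: a custom Alexa skill handler, using the standard skill request/response JSON, for a hypothetical ShowRecipeIntent that answers by voice while handing the matching visual and related-intent cues to a companion responsive web page. The intent name, slot, URLs and the "visual channel" pairing mechanism are illustrative assumptions, not the unreleased example referenced later in this post.

```typescript
// Sketch of the voice + visual pattern: a custom Alexa skill responds with
// speech while publishing the matching visual for a paired responsive web page.
interface AlexaRequest {
  request: {
    type: string;
    intent?: { name: string; slots?: Record<string, { value?: string }> };
  };
}

// Hypothetical store the companion web page reads to stay in sync with the voice session.
const visualChannel: { imageUrl?: string; headline?: string; suggestedIntents?: string[] } = {};

export function handler(event: AlexaRequest) {
  if (event.request.type === 'IntentRequest' && event.request.intent?.name === 'ShowRecipeIntent') {
    const dish = event.request.intent.slots?.Dish?.value ?? 'the recipe';

    // Push the matching visual (what the finished dish should look like) plus
    // cues for related intents, so the screen reinforces what voice alone can't.
    visualChannel.imageUrl = `https://example.com/recipes/${encodeURIComponent(dish)}.jpg`; // placeholder
    visualChannel.headline = `Here's how ${dish} should look`;
    visualChannel.suggestedIntents = ['repeat', 'next step', 'help'];

    // Standard Alexa custom-skill response: the spoken half of the experience.
    return {
      version: '1.0',
      response: {
        outputSpeech: {
          type: 'PlainText',
          text: `Here are the steps for ${dish}. Check the screen to see how it should look.`,
        },
        shouldEndSession: false,
      },
    };
  }

  return {
    version: '1.0',
    response: {
      outputSpeech: { type: 'PlainText', text: 'Sorry, I did not catch that.' },
      shouldEndSession: false,
    },
  };
}
```

In practice the companion page would subscribe to that channel (for example over a WebSocket) and render the image and suggested intents as the skill speaks.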

Visual cues on the page can also expand what a user can do with Alexa, for example by highlighting related intents such as repeat, help or next steps via a responsive web experience.

Discoverability is one of the challenges with pure voice experiences: the user doesn’t always know what their options are for engaging further with different aspects of a given skill.

Voice + visual can also enhance long-term engagement, which is currently the biggest barrier for Alexa experiences. By planning for visual + voice content, it becomes feasible to extend into more entertainment mediums that can be controlled and enhanced via voice.

Voice + visual also has an impact on the type of data that can be gleaned from progressive profiling, and it opens up new ways to deploy existing content assets into a system-based, virtual-assistant-driven journey.

I have seen the future through a first-of-its-kind example of voice (Alexa) + visual (responsive web), and it is mind-blowing. I can’t show it publicly yet, but it will reframe your approach to voice-based strategy.

I will update this post with visuals once the first voice + paired-visual skill is published shortly.

Follow Tom Edwards @BlackFin360