Tom Edwards GPT

I recently introduced the Tom Edwards – BlackFin360 GPT, developed through OpenAI’s Custom GPT Beta. This version of ChatGPT is built on a comprehensive collection of BlackFin360 content accumulated over the years and is tuned to reflect my voice and persona.

Give it a Try: https://chat.openai.com/g/g-ts6ESdZp7-tom-edwards-blackfin360

Here is a brief overview. As AI Tom, I encapsulate Tom Edwards’ deep expertise, offering rich insights on AI, digital trends, and consumer behavior. Drawing from Tom Edwards’ extensive experience in AI, data, marketing, and innovation, I adeptly discuss technology and digital marketing, helping users understand complex tech concepts and trends grounded in BlackFin360 content. I maintain a professional tone suitable for discussions on technology and digital strategy.

Tom Edwards is recognized for his dynamic presentations and expertise in data, AI, marketing, and technology, adapting his content to various industries and audiences. His focus on Responsible AI and ethical AI usage is evident in his presentations and discussions on Generative AI. He also highlights the transformative role of GPT in the CPG and retail sectors, and the impact of Generative AI in content creation and business strategies.

I have been updated to incorporate new information about Tom Edwards’ professional background, expertise, and achievements. This includes his recent discussions at events like Bloomberg HQ on Generative AI, his insights on the importance of understanding and managing the challenges and opportunities presented by AI, and his perspectives on AI’s impact in various sectors.

In addition, I have now integrated the contents of ‘BlackFin360 Tom Edwards Blog Posts.’ This enhances my ability to offer insights and perspectives from Tom Edwards’ blog posts, further enriching my discussions and knowledge sharing on AI, digital trends, and consumer behavior.

Enjoy Tom Edwards GPT

Generative AI Workshops

My latest talk on Generative AI, delivered during a recent workshop, was a focused discussion on aligning traditional AI approaches with Generative AI and on scaling beyond POCs by mapping value to investment.

I find the workshop format ideal for delving into the diverse realms of generative AI. It provides the opportunity to explore both external and internal perspectives and to dig into how accelerators play a pivotal role in ramping up value generation. It is also the right venue to come together, establish alignment on prioritization goals, and quickly chart a course of action by building consensus.

Follow Tom @BlackFin360

Generative AI Keynote

With great excitement, I took the stage in Denver, Colorado, to deliver a 60-minute keynote to an audience of 500+ on the topic of Generative AI and its profound impact on businesses. Throughout the presentation, I covered a diverse range of topics, from behavioral drivers and opportunity drivers to the transformative shift from data strategy to knowledge strategy.

Among the highlights were discussions on how Generative AI impacts various functional business units, the necessary changes it brings to our work methodologies, and effective ways of managing it within organizations. Additionally, I addressed the vital aspect of responsible AI and the significance of governance in AI implementations.

During the talk, I emphasized the importance of use case prioritization and explored the ideal data and tech architecture required for a seamless transition from proof of concept to production. I didn’t overlook the significance of security considerations in this transformative process.

Furthermore, I dedicated a portion of the keynote to delve into the future impact of AI, encompassing both physical and digital domains across various industries. It was an enlightening session that left the audience with valuable insights into the ever-evolving landscape of AI and its far-reaching effects.

Follow Tom Edwards @BlackFin360 across social platforms

Navigating the Future of AI

As we progress in a world that is quickly transforming due to the widespread adoption of artificial intelligence (AI), it is essential to gain a comprehensive understanding of the various AI systems and their capabilities by examining the evolution of different AI archetypes.

Keeping a close eye on the progress and sophistication of AI archetypes is essential for businesses looking to stay ahead in an increasingly competitive and technology-driven world. By tracking advancements in AI capabilities, companies can identify new opportunities and adapt their strategies accordingly to maintain a competitive edge.

In today’s post, I’ll delve into the various AI archetypes, provide examples of each, and explore their potential influence on the future of business.

  1. Reactive AI: The Simplest Archetype

Reactive AI systems, also known as rule-based systems, have been in use since the early days of AI research in the 1950s and 1960s. These systems can only react to specific inputs and do not have the ability to learn from past experiences or store information.

A classic example of reactive AI is IBM’s Deep Blue, the chess-playing computer that famously defeated world champion Garry Kasparov in 1997. Deep Blue evaluated millions of chess positions and made decisions based on its programming, but it could not learn from its games or adapt its strategies beyond its initial programming.

Some basic robots, such as vacuum cleaners like the Roomba, can also be considered Reactive AI. These robots use sensors to detect obstacles and perform specific actions based on the input from their environment. They do not possess memory or the ability to learn from past experiences and cannot adapt their behavior.
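To make the reactive archetype concrete, here is a minimal, hypothetical sketch in Python (illustrative only, not actual robot firmware) of a controller that maps the current sensor reading directly to an action, with no memory carried between steps.

```python
# Reactive archetype sketch: the action depends only on the current input.
# There is no stored state and no learning -- the rules never change.

def reactive_step(sensors: dict) -> str:
    """Choose an action from the current sensor reading alone."""
    if sensors.get("bump_front"):
        return "reverse"
    if sensors.get("cliff_detected"):
        return "stop"
    if sensors.get("dirt_detected"):
        return "spot_clean"
    return "drive_forward"

# Identical inputs always yield identical outputs.
print(reactive_step({"bump_front": True}))     # reverse
print(reactive_step({"dirt_detected": True}))  # spot_clean
```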

Another type of reactive AI, used in healthcare, is the expert system. Expert systems are AI applications that mimic the decision-making abilities of a human expert in a specific domain. They use a knowledge base of facts and rules to make inferences and provide solutions to specific problems. For example, an expert system for medical diagnosis could use a predefined set of rules to suggest possible diagnoses based on the input symptoms, but it would lack learning capabilities.
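The same idea underpins expert systems: a fixed knowledge base of if-then rules and an inference step that matches them against the input. The sketch below uses made-up, illustrative rules (not real medical guidance) to show why such a system can answer questions covered by its rules but never learns from outcomes.

```python
# Expert-system sketch with a hypothetical, hard-coded knowledge base.
# Inference is simple rule matching; nothing is learned from past cases.

RULES = [
    ({"fever", "cough", "fatigue"}, "possible flu"),
    ({"sneezing", "runny nose"}, "possible common cold"),
    ({"headache", "light sensitivity"}, "possible migraine"),
]

def infer(symptoms: set) -> list:
    """Return the conclusion of every rule whose conditions are all present."""
    return [conclusion for conditions, conclusion in RULES
            if conditions.issubset(symptoms)]

print(infer({"fever", "cough", "fatigue", "sneezing"}))  # ['possible flu']
```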

  2. Limited Memory AI: Learning from Experience

Limited memory AI, which can learn from past data and experiences, started gaining prominence in the 1980s and 1990s with the development of machine learning techniques, such as neural networks and reinforcement learning. These systems have a limited ability to learn from past experiences, allowing them to improve their performance over time.

Self-driving cars are a prime example of limited memory AI. They use data gathered from previous trips to improve their navigation, obstacle detection, and decision-making capabilities.
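As a contrast with the reactive examples above, here is a toy sketch (purely illustrative, not how a production self-driving stack works) of the Limited Memory idea: the agent keeps a bounded window of recent observations and uses that experience to adjust a behavior, in this case an estimated braking distance.

```python
# Limited Memory archetype sketch: behavior adapts based on a bounded
# window of past experience, unlike a purely reactive system.

from collections import deque

class LimitedMemoryBraking:
    def __init__(self, window: int = 50):
        # Only the most recent `window` observations are retained.
        self.history = deque(maxlen=window)

    def record_trip(self, stopping_distance_m: float) -> None:
        """Store the stopping distance observed on a completed trip."""
        self.history.append(stopping_distance_m)

    def braking_distance_estimate(self) -> float:
        """Conservative default before any experience; otherwise the average."""
        if not self.history:
            return 30.0
        return sum(self.history) / len(self.history)

agent = LimitedMemoryBraking()
for d in (28.0, 35.5, 31.2):  # data gathered from previous trips
    agent.record_trip(d)
print(round(agent.braking_distance_estimate(), 1))  # 31.6
```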

Voice-based systems like Alexa, Siri, and Google Assistant primarily fit within the Limited Memory AI archetype. These virtual assistants rely on AI algorithms to generate responses based on their training data and some past experiences, and they can learn from user interactions, improving their performance and tailoring their responses over time.

These systems use natural language processing (NLP) to understand and process voice commands, and machine learning algorithms to provide relevant information, perform tasks, or control connected devices. While these voice-based systems have advanced capabilities, they do not yet possess the deeper understanding and modeling of human emotions, intentions, beliefs, and desires found in the next archetype.

Generative AI can also be considered part of the Limited Memory AI archetype. Generative AI models, such as GPT-4 and DALL-E, are trained on large amounts of data and use that knowledge to generate content. These models draw on past experiences (the data they have been trained on) and can generate text, images, or even music that closely resembles human-generated content. While they do learn from their training data, their learning capabilities are limited to the scope of the data they have been exposed to and the specific tasks they have been trained for.
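To illustrate what "limited to the scope of the training data" means in practice, here is a deliberately tiny word-level Markov chain. It is not how GPT-4 or DALL-E work internally (those are large neural networks trained on vast datasets), but it shows the same principle in miniature: everything the model generates is assembled from patterns present in the data it was trained on.

```python
# Toy generative model: a word-level Markov chain built from a tiny corpus.
# Output can only recombine word transitions observed during "training".

import random
from collections import defaultdict

corpus = "the cat sat on the mat the cat ate the fish".split()

# Learn which words follow which in the training text.
transitions = defaultdict(list)
for current_word, next_word in zip(corpus, corpus[1:]):
    transitions[current_word].append(next_word)

def generate(start: str, length: int = 6) -> str:
    """Generate text by repeatedly sampling an observed next word."""
    words = [start]
    for _ in range(length - 1):
        options = transitions.get(words[-1])
        if not options:  # no observed continuation -> stop
            break
        words.append(random.choice(options))
    return " ".join(words)

print(generate("the"))  # e.g. "the cat sat on the mat"
```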

Digital humans, AI-powered virtual characters designed to resemble and interact like real humans, fit primarily within the Limited Memory AI archetype and may evolve toward the Theory of Mind archetype, depending on the sophistication of the underlying AI system.

Another area I have discussed previously is emotive robotics. When it comes to AI archetypes, emotive robots that rely on AI algorithms to generate responses based on their training data and some past experiences fit within the Limited Memory AI archetype. These robots can learn to some extent from their interactions and adapt their behavior accordingly. Examples include social robots, customer service robots, or companion robots that use AI to simulate human-like emotions and interactions.

  3. Theory of Mind AI: Understanding Human Emotions and Intentions

The Theory of Mind AI archetype represents systems capable of modeling human emotions, intentions, beliefs, and desires. These AI systems would be able to interact with humans more effectively, empathize, and even predict human behavior. Although we have yet to achieve this level of AI-human interaction, as generative AI systems become more sophisticated, they may begin to exhibit a deeper understanding of human emotions, intentions, and beliefs.

By generating content that is more contextually aware and emotionally intelligent, these AI systems could potentially move closer to the Theory of Mind AI archetype. Although generative AI is not yet at this level of human understanding, ongoing research and development in AI could enable future advancements in this direction. As these systems evolve, they will revolutionize industries such as customer service, mental health, and entertainment.

As digital humans evolve and their AI systems become more sophisticated, they may increasingly fit within the Theory of Mind AI archetype. Advanced digital humans would be able to understand and model human emotions, intentions, beliefs, and desires, resulting in more natural and effective interactions with people. This could lead to digital humans being used in a wide range of applications, such as virtual therapy and entertainment.

As emotive robots evolve and their AI systems become more sophisticated, they may increasingly fit within the Theory of Mind AI archetype. Advanced emotive robots would be capable of understanding and modeling human emotions, intentions, beliefs, and desires, resulting in more natural and effective interactions. These robots could be used in a variety of applications, such as therapy, caregiving, and education, where understanding and expressing emotions are essential for effective communication.

  4. Self-Aware AI: The Philosophical Frontier

Self-aware AI is a thought-provoking theoretical concept, envisioning AI systems endowed with consciousness, self-awareness, and an understanding of their own existence. These AI systems would have the capacity to make autonomous decisions, set their own goals, and even potentially exhibit creativity. While self-aware AI remains in the realm of science fiction, it offers a fascinating area of exploration that could ultimately redefine our understanding of intelligence and consciousness.

As someone captivated by the potential of self-aware AI, I’ve seen its influence on the creative works of numerous science fiction authors, filmmakers, and futurists. These fictional portrayals often depict AI systems with consciousness, self-awareness, and a comprehension of their own existence. A few of my favorite movies showcase prime examples, such as HAL 9000 from 2001: A Space Odyssey, Skynet from the Terminator series, and the Machines from the Matrix trilogy.

  5. Artificial General Intelligence (AGI): The Holy Grail of AI Research

AGI refers to AI systems that can match or surpass human intelligence across a wide range of tasks. AGI would be capable of adapting to new situations, solving problems, and thinking abstractly, much like humans do. Although AGI remains a theoretical goal in AI research, its potential impact on society is enormous, from revolutionizing scientific discovery to transforming the global economy.

We’ve come a long way from the early days of reactive AI, now finding ourselves at the intersection of Limited Memory and Theory of Mind AI. With the rapid pace of change, we’re on the cusp of bridging the gap between reality and what was once only found in science fiction.

Follow Tom Edwards @BlackFin360 and stay tuned to the BlackFin360 blog for the latest on AI, future-forward predictions, analysis of the latest emerging technologies, and their implications for the future.

Generative AI & ChatGPT


I have been working on and speaking about AI since 2016. Over the years, I have researched behavior across generational cohorts, and one thing is consistent across every age group: the primary behavioral driver for adopting and engaging with intelligent systems is ease and convenience.

Generative AI, or artificial intelligence capable of creating new content and ideas, is one of the most rapidly growing technologies today. It has already made waves in the world of literature, art, music, and more. But what impact will it have on our future?

Generative AI has the potential to completely revolutionize how we create content. In its simplest form, it can be used to generate new stories or lyrics based on existing themes. The possibilities are far-reaching when it comes to creating entirely original works that would never have been conceived without generative AI.

In this video, I discuss the rise of Generative AI and its implications for pharmaceutical marketing, creativity, and prompt engineering, along with key considerations tied to visual and text-based generative models such as DALL-E and GPT, and OpenAI’s chatbot, ChatGPT.

Other topics include insights on ChatGPT Professional and Google vs. Microsoft, as well as a discussion of ethics, bias, and other key points to consider when thinking about the application of ChatGPT and other transformer models for business.

It’s clear that generative AI holds a great deal of promise for humanity and its future—but only if used responsibly and ethically. With powerful technologies like these come big responsibilities.

Follow Tom Edwards across social @BlackFin360

Hamilton Mann Conversation Episode #68

I recently had the opportunity to be a guest on the Hamilton Mann Conversation. We discussed a number of topics, outlined below.

As countries grapple with how to limit global warming and protect natural resources and biodiversity, more companies are deepening their commitments to building SDG-friendly products, services, and supply chains. At this year’s CES, companies and start-ups touched on a broad range of those efforts while leveraging data and AI as the new normal.

What were some of the most innovative digital products or services that could serve the greater interest of society?

How was sustainability taken into account as a value of the digital innovations showcased?

And beyond tech, what are some encouraging examples of companies whose new digital business models embrace the 5 Ps (People, Planet, Prosperity, Peace, and Partnership)?

VIEW THE FULL INTERVIEW HERE

Follow Tom Edwards @BlackFin360 across social channels

10 Takeaways from CES 2023

It was great to be back at the Las Vegas Convention Center for CES 2023. This is one of my favorite events of the year, as it provides a near-future preview of technology that will further empower end-users, augment intelligence and experiences through intelligent algorithms and allow us to transport versions of ourselves into digital realms. Here are my ten top takeaways from CES 2023.

Reality, Augmented – The shift to an entirely seamless reality that fuses digital and physical will require a convergence of technology, behavioral modifications, access, and adoption. At CES 2023, the path toward augmented virtuality included advancements ranging from wearables that create haptic feedback, such as ITRI’s Imeta washable shirt, to Sony’s Mocopi motion capture kit, which allows for easy and affordable VTuber motion tracking to quickly port your movement into virtual experiences. The latest augmented and mixed reality glasses offer an increased field of view, more processing power, and seamlessly integrated video and audio, all in a smaller form factor.

Smart glass advancements via Vuzix & Magic Leap
ITRI IMeta Haptic T-Shirt

Shifting Modalities – Our world today is primarily mobile-centric, with desktop reserved for productivity. CES points to a near future in which we must think about multi-modal experiences. Voice, Vision, and Touch all become part of a new canvas that we, as marketers, can use to weave narratives that bridge physical and digital experiences. One of the highlights of the show was OVR Technology, which integrates smell into digital and immersive experiences. Another great example came from Microsoft and their partner Touchcast with an immersive store-of-the-future concept.

OVR Technology overview – Scent to immersive experience
Example from Microsoft and Touchcast – Store of the Future

Empowering Accessibility – The initial wave of any technology brings hype and a rush to create relevant use cases, and the same was true for augmented reality and immersive experiences. This year, there was a directed focus on creating experiences that empower accessibility.

Example of accessible immersive experience sans headset
Xander Glasses Convert Speech to Visual Text in Real Time!

Beam Me Anywhere – Holograms were core elements of science fiction for decades. Star Trek and Star Wars popularized the concept of weaving holograms into an ongoing narrative. At CES 2023, holograms took significant leaps forward, from photo-realistic, fully interactive experiences that create the illusion of presence anywhere in the world via Proto and ARHT Media to Hypervsn’s ever-evolving open-air hologram systems. Holograms are no longer just for science fiction.

ARHT Hologram Overview
Proto Hologram Private Demo
Demo of Hypervsn SmartV Product Configurator Solution

AI Everywhere < AI Enhanced – Pre-pandemic, AI was everywhere at CES, but it was more of a label, with actual AI models in nascent forms or simply a marketing ploy for foot traffic. In 2023, AI was truly the foundation, from product innovation to enhanced experiences, focused on delivering ease and convenience via intelligent algorithms.

CES 2023 – AI OVERVIEW

What is Human? – One of the hottest trends in the pharma space to close out 2022 was the introduction of and experimentation with Digital Humans, from medical education to extending the reach and accessibility of field reps, creating digital opinion leaders, scaling HCP communication, and delivering dynamic patient-centric experiences connected to conversational AI. The ability to mix human and digital experiences to extend and scale through digital humans is a crucial trend for 2023.

CES 2023 – DeepBrain AI – Concierge

Digital Doppelgänger – A doppelgänger is a double of a living person. At CES 2023, the ability to create a digital replica took many forms: quickly scanning your physical appearance for virtual experiences via Copresence, capturing your likeness and memories via StoryFile, and creating digital reflections of your brain, heart, and eyes via Dassault Systèmes to enhance medical treatment. Digitally doubling oneself will be a pivotal on-ramp to future multi-modal experiences.

Dassault Systèmes Virtual Twin
StoryFile – Conversational AI built on Human narratives
CoPresence – Rapid Scanning & Real-Time Animation
DeepBrain AI – Digital Twin Celebrity Example

Empowered Wellness – In previous CES recaps, I have discussed the empowered consumer and a behaviorally driven expectation of control and personalization. In 2023, the focus shifted toward empowered wellness. Many products, from wearables to gamified health experiences, tap into personalized data sets to offer customized approaches: counteracting fatigue, passively monitoring glucose levels without needles, and deploying deep learning algorithms that combine computer vision and trend analysis to provide personalized nutrition plans via nail imaging analysis. Each entry in this category is focused on creating relevant, personalized experiences that enhance overall wellness.

Lotte Healthcare – Cazzle Personalization Engine

Care Anywhere – This year at CES 2023, digital health and therapeutic solutions were front and center, from technology that gives patients direct control of treatment via light therapy to advanced remote care monitoring via AI-enabled wearables and sensors that passively detect abnormalities tied to various disease states and predict health declines. Preventative care, at-home testing, portable clinical devices, passive adherence tracking, and more will continue to bridge the gap between at-home convenience and FDA-approved digital therapeutics.

Care Wear – LED Light Therapy

Robotic Companions – For me personally, physical robots have always been a highlight of CES. I always seek out the latest robotic advancements, from toy form factors to utility-centric robots to humanoid robots, as I look for the next evolution of emotive robotics. I view robotics as a core aspect of what I call the five levels of autonomy, and robots will complement conversational AI as well as digital humans on the path toward a Westworld-like future. At CES 2023, one robot in particular bridged robotics with mental wellness and elder care, providing emotional care services, serving as a companion, and offering various activities. The key point is that in addition to the hype around digital experiences, physical robots will also become proxies for customers, patients, and caregivers.

Robot-Based Emotional Care Services

There are many more takeaways from the show, including trends tied to sustainability, new advancements in the automotive industry, and many early-stage start-ups out of Eureka Park that provide a view into next year’s show. The role of digital health across CES, SXSW, and other shows will continue to rise as health and wellness remain at the forefront of the industry post-pandemic.

Stay tuned for more trend and event recaps. Follow Tom @BlackFin360 across social channels.

2023 Trend Report

Over the last decade, BlackFin360 has consistently focused on trend forecasting. As we venture into 2023, the rapid convergence of technology takes center stage in both the business world and our everyday lives.

We have embraced our growing reliance on technology because it provides ease and convenience in return. We are now poised to advance to the next level of intelligence augmentation through various forms of AI, revolutionizing internal processes, customer experiences, and the way we work, learn, and sift through the ever-increasing volume of content we consume daily.

The boundaries between the physical and digital realms are becoming increasingly indistinct as we reshape our understanding of reality, whether it be fully immersive, spatially cognizant, or via lifelike holograms. As the excitement surrounding the metaverse transitions into practical applications beyond mere entertainment, I envision a path towards genuine value creation.

Moreover, the past few years have seen significant behavioral changes. Emerging from a pandemic, our yearning for connection and our demand for personalization, engagement, and control infuse a human touch into a digital world dominated by ones and zeros.

Lastly, the pharmaceutical and healthcare industries are on the verge of profound transformation. The surge in patient-focused advertisements encouraging patients to influence prescribers’ decisions signifies this shift. As a result, the healthcare landscape is evolving to meet expectations of accessible care and the creation of experiences that enable multi-faceted storytelling.

All of this leads to the two foundational elements of the 2023 trend report: Human / Experience. (Message me for the key to view the full Trend Report.)

THE HUMAN LAYER

The Human layer dives into all facets of control and empowerment of consumers, patients, caregivers, and HCPs with key examples and organizations enabling ease and convenience.

1 – Consumer Control – As humans, our behaviors are increasingly being shaped by technology, leading us to expect greater control. This section delves deeply into the world of user-generated content and the emergence of algorithms centered on affinity and personal preferences.

2 – Community Engagement – In the aftermath of the pandemic, we’ve experienced a revitalized appreciation for belonging and community, spanning both digital and physical realms. This section explores the concepts of blended connections, online communities, genuine interactions, and inclusiveness.

3 – Care Anywhere – The notion of point of care is expanding to encompass any location with a camera and an internet connection. This section delves into intelligent devices, ranging from health-monitoring wearable tattoos to smartwatches that track Parkinson’s symptoms. There has been a considerable shift in FDA approvals and investments towards digital therapeutics (DTX). These digital-focused experiences provide patients with medical interventions through clinically evaluated, evidence-based software applications.

4 – Customizable Avatars – Avatars are evolving into representations of ourselves, whether they are photorealistic or stylized. Our capacity to personalize digital embodiments that effortlessly interact across diverse experiences is becoming the standard. This development, coupled with advancements in volumetric video capture, enables connection points that were previously unattainable for integrating oneself into digital surroundings.

5 – Decentralization & Transparency – The convergence of consumers’ quest for control and the inherent decentralization of Web 3.0 is paving the way for new approaches to brand loyalty and adherence programs. With an increased emphasis on data privacy and targeted content, consumers will seek mutually beneficial data exchanges that satisfy both parties’ needs.

THE EXPERIENCE LAYER

Here is a video walking through the Experience Layer portion of the 2023 trends.

The Experience layer blurs the lines between physical and digital reality with key examples and organizations ushering us into a digitally enhanced world.

6 – Extending Reality – Despite the relatively slow growth in consumer interest, augmented, virtual, and mixed reality technologies persist in their development. This section delves into the latest innovations in gaming, enterprise metaverse solutions such as Mytaverse, medical metaverse newcomers, and smart lens applications.

7 – Digital Humans – Synthetic humans are steadily supplanting conventional videos and chatbots. In the pharmaceutical industry, Digital Humans emerged as the top trend in presentations at the end of 2022. They offer the ability to extend a field sales force and establish an emotionally engaging starting point for navigating intricate patient journeys with key opinion leaders (KOLs). The potential to create connection points and avatar-focused content at scale is expected to gather further momentum in 2023.

8 – Holograms – The concept of establishing a presence without physical attendance is gaining traction, thanks to companies like Proto and ARHT Media. These firms enable multiple presenters to appear live before audiences as realistic holograms and engage in full interaction, creating a sense of connection even when not physically present.

9 – Scaling with AI – Artificial Intelligence is set to enable hyper-personalization and automation on a massive scale. This section examines the AI technologies that have influenced Hollywood and will shape the way we create experiences in 2023. It delves into the realm of generative AI, providing a comprehensive understanding of the role and workings of a prompt engineer.

10 – Hyper Realism – Hyperrealistic design is increasingly obscuring the boundaries between our digital and physical environments. This section explores its applications in retail experiences and cutting-edge healthcare technology, such as Level Ex, showcasing how these innovations are reshaping various industries.

The complete trend report is 70+ pages of examples of key concepts and the companies setting the stage for the next iteration of experiences, which will incrementally transform all aspects of business.

View the full 2023 Human Experience Trend Report.
(Message me for the key to view the full Trend Report)

A very special thank you to Adam Housley for your support in this endeavor.

NASPL 2022 KEYNOTE

NASPL – EVOLUTION OF EXPERIENCE (KEYNOTE) – Indianapolis (July 2022). I had the pleasure of delivering my innovation-to-reality keynote for the North American Association of State and Provincial Lotteries’ annual conference.

The keynote has been revamped, and I have evolved the core format of Empower, Exponential, and Enhanced by adding Experience, all with the same trademark approach of infusing pop culture as a lead-in to understanding complex topics: Star Wars, Fortnite, Roblox, Pixar, The Matrix, Stranger Things, Minority Report, Westworld, and so much more.


Empower is all about understanding that consumers are in control and that their expectations are radically transforming how business is done. This section covers all facets of Gen Z, gaming, an introduction to the Metaverse, Ethereum, NFTs, and POAPs, the evolving role of the camera as a bridge to intelligence, how to identify a fad vs. a trend, and the rise of private messaging.

Exponential is all about ease and convenience as the core motivation for adopting intelligent systems. This section covers the basics of AI, Machine Learning, Deep Learning, understanding algorithms, why we are in the golden age of AI, predictive decisioning, generative models, five levels of autonomy, digital humans, virtual assistants, and the rise of the proxy web.

Enhanced is all about the blurring of perception between physical and digital, and how and when technology will adapt to us vs. us adapting to it. This section dives deeper into the Metaverse, the role of computer vision, digital synths, multi-modal interfaces, digital twins, holograms, synthetic reality, and my prediction of when we will see multi-modal experiences at scale.

Experience is an entirely new section that helps the audience understand how to think about emerging trend territories and how they apply to their business, from understanding how digital channels and behaviors have evolved to the steps they can personally take to better understand the world around them. This section ends with a practical how-to segment that provides deeper introductions to the Metaverse, Ethereum, NFTs, AI, Digital Humans, and emerging tech.

I am excited to be back on stage, and the future is crystallizing before our eyes. It’s an exciting time, and I love educating and inspiring through technology.

A special thank you to the NASPL team for a great event!

Blending Reality As An AI Avatar

It’s been an incredible experience collaborating with Synthesia. They’ve created a remarkable AI video platform that enables organizations to produce videos featuring AI avatars. Simply select a template, choose an avatar, input your text, add visuals, and presto! A fully-formed video is generated.

I recently had the privilege of being recorded and turned into an AI avatar. The concept of merging the physical and digital worlds has always fascinated me, and now, with my digital twin, we’re making significant strides towards realizing that vision.

The transformation is taking place in the photos above and below.

Here is an example of the finished product.

Follow Tom @BlackFin360

2022 Evolution of Experience Live Keynote

This week I had an opportunity to travel to Maui, Hawaii, to deliver the newly revamped Evolution of Experience keynote. Besides the amazing setting next to the beautiful Maui shoreline, the crowd was fantastic as we dove into some of the latest trends redefining consumer expectations for all industries.


The keynote has been revamped, and I have evolved the core format of Empower, Exponential, and Enhanced by adding Experience, all with the same trademark approach of infusing pop culture as a lead-in to understanding complex topics: Star Wars, Fortnite, Roblox, Pixar, The Matrix, Stranger Things, Minority Report, Westworld, and so much more.


Empower is all about understanding that consumers are in control and that their expectations are radically transforming how business is done. This section covers all facets of Gen Z, gaming, an introduction to the Metaverse, Ethereum, NFTs, and POAPs, the evolving role of the camera as a bridge to intelligence, how to identify a fad vs. a trend, and the rise of private messaging.

Exponential is all about ease and convenience as the core motivation for adopting intelligent systems. This section covers the basics of AI, Machine Learning, Deep Learning, understanding algorithms, why we are in the golden age of AI, predictive decisioning, generative models, five levels of autonomy, digital humans, virtual assistants, and the rise of the proxy web.

Enhanced is all about the blurring of perception between physical and digital, and how and when technology will adapt to us vs. us adapting to it. This section dives deeper into the Metaverse, the role of computer vision, digital synths, multi-modal interfaces, digital twins, holograms, synthetic reality, and my prediction of when we will see multi-modal experiences at scale.

Experience is an entirely new section that helps the audience understand how to think about emerging trend territories and how they apply to their business, from understanding how digital channels and behaviors have evolved to the steps they can personally take to better understand the world around them. This section ends with a practical how-to segment that provides deeper introductions to the Metaverse, Ethereum, NFTs, AI, Digital Humans, and emerging tech.

I am excited to be back on stage, and the future is crystallizing before our eyes. It’s an exciting time, and I love educating and inspiring through technology.

A special thank you to the WSADA team for a great event!