
When I first heard about Samsung’s AI companion, Bixby, which launched on March 20, 2017, my first thought was, ‘Don’t Siri, Alexa, Cortana and Google Assistant already have this market cornered?’ During a recent conversation with Adam Cheyer, however, I gained a whole new perspective on Bixby and its potential. If you’re an ISV who’s developed voice-enabled apps or has an interest in this topic, I highly recommend giving this interview a read.
Who is Adam Cheyer?
Adam Cheyer has a long history with digital assistant technology. He developed his first digital assistant in the 1990s at Stanford Research Institute, and in 2007 he cofounded Siri, which was acquired by Apple in 2010. Today, he’s a cofounder and VP of engineering at Viv Labs, VP of research and development at Samsung and a founding member of Change.org and Sentient Technologies.
Here are the highlights from our conversation:
How does Bixby compare to the other AI-powered virtual assistants?
Adam Cheyer: This is an entirely different platform than anything else out there. Most competitors are focused on understanding natural language. They give developers limited tools to classify, parse and understand speech, and then a human developer has to code the logic that determines the next steps. For instance, if you ask most virtual assistants, ‘What’s the temperature in Boston?’ they will most likely launch a weather map displaying the current temperature in Boston, Massachusetts. A person had to code those responses, including the logic for disambiguating the 13 towns and cities in the US named ‘Boston’ so the user receives the correct information.
With Bixby, we have AI that sits in the developer tool and writes the code automatically to handle every use case. The developer defines the goal and follow-up steps using natural language. For example, a user may ask, ‘What’s the weather tomorrow in Boston, Massachusetts?’ Once the developer describes the goal, the system builds the program, then interacts with and learns from the user.
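To make the contrast concrete: rather than hand-coding dialog flow, a Bixby capsule developer declares a goal as a model, and the platform plans how to reach it. The sketch below is a rough, hypothetical example in Bixby’s declarative modeling language — the action name, input names, and types are illustrative assumptions, not taken from a real weather capsule.

```
// models/actions/FindWeather.model.bxb (hypothetical capsule model)
// Declares a goal; Bixby's planner figures out how to collect the
// inputs (prompting the user if needed) and reach the output.
action (FindWeather) {
  type (Search)
  collect {
    // Where the user wants the forecast; required, so Bixby will
    // ask for it (and disambiguate, e.g. the many US 'Bostons')
    // if the utterance doesn't supply one.
    input (where) {
      type (Location)
      min (Required)
    }
    // Optional time expression, e.g. "tomorrow".
    input (when) {
      type (DateTime)
      min (Optional)
    }
  }
  output (WeatherCondition)
}
```

The point of the declarative style is that the developer states *what* the goal needs (a location, optionally a time), and the platform generates the conversational logic — prompting, disambiguation, follow-ups — that competitors would require a developer to code by hand.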
What’s your vision for Bixby?
Adam Cheyer: We designed Bixby based on four principles:
- One assistant. As a user, I want one digital assistant that can do 50,000 things, not 50,000 different assistants that each have their own experience. We want 95% of Bixby’s development to come from third-party ISVs and only 5% from our developers so that it can expand outside the Samsung ecosystem.
- Device agnostic. Bixby lives in the cloud and interacts with users through a wide variety of devices, including refrigerators and TV sets. Users shouldn’t have to worry about interacting one way with their TV and another way when using their phones. Plus, if a user signs up for Spotify on their phone, they shouldn’t have to repeat the same setup process on their portable speaker.
- A democratic approach to software development. Many of the big companies give web-based tools to developers, but their internal development teams use different tools with way more features. With Bixby, we’re giving the most sophisticated tools to third-party developers—the same exact tools that Samsung developers use to build our own capsules.
- Personalized for each user. With every other virtual assistant, everyone gets the same experience even though two people may have vastly different app portfolios and user preferences. Our goal with Bixby is that it learns each user’s preferences to create a unique experience from person to person.
We haven’t achieved all four goals yet with Bixby 2.0, but those are our guiding principles and goals. Ultimately, we want the digital assistant to become what mobile devices and web browsers have become – essential parts of our everyday work and personal lives.
How can developers get involved with Bixby?
Adam Cheyer: ISVs can learn more about Bixby and the Bixby Developer Center at https://bixbydevelopers.com/. At the site, you can download sample capsules, which include everything required to interface with our content and services. You can also create a team and begin developing a capsule. Once you’ve completed it, you can submit it for review. Upon approval, your capsule will be available to consumers once the Bixby Marketplace opens later this year.
We also have a Premier Developer Partner program that ISVs can apply to. It offers several additional perks, including invitations to exclusive events, access to tutorials, and white-glove support from the Samsung developer team. To qualify for the program, ISVs must show prior software development success, preferably with voice-enabled apps.
Our next Bixby Developer Session is Saturday, June 22, 2019, at the Brooklyn Navy Yard. The previous two sessions were well attended and featured a combination of hands-on entrepreneurial and technical training, networking and a hackathon. Attendees can build a capsule in a couple of hours and leave with a working prototype that they can continue to evolve. It’s free to attend. Register here.
Final thoughts?
Adam Cheyer: Someone once asked me, how can you compete with Amazon, which has 10,000 people working on Alexa? The beauty of what we’re doing with Bixby is that we’re not building the world’s best assistant. We’re enabling the world to create the best assistant. If we can harness the collective intelligence of the world’s developers, the world wins.
About 70% of US households have at least one Samsung device, and by 2020 every Samsung device will be connected, with Bixby built in. Bixby represents a massive opportunity for software developers to get in on the ground floor of a technology wave and marketplace with tremendous growth potential.
See Samsung at VOICE Summit 2019 in Newark, NJ (July 22-25, 2019). Register here.
VOICE 2019 will convene more than 5,000 developers, designers, C-level executives, leading brands, and agencies that are reimagining how society interacts with technology through voice. With 150 breakout sessions, keynotes and executive panels, attendees will get firsthand access to the foremost pioneers behind breakthroughs in development, design, monetization, and user experience.