
DURHAM, NORTH CAROLINA — IBM Watson came to Moogfest 2016, but there were no Jeopardy! questions this time around. If you've been following ExtremeTech, you already know that IBM Watson, an artificially intelligent system capable of answering questions in natural language, has been up to much more than that recently. At Moogfest, IBM Watson team spokesperson Ally Schneider was on hand to outline the latest developments.

Everyone remembers Watson from its Jeopardy! performance on television in 2011. But work on the project started much earlier — not just in 2006, when three researchers at IBM first got the idea to build a system for the game show, but decades before that, as IBM began doing work on natural language processing and cognitive computing in the 1970s.


The Jeopardy! Watson system in 2011 had three main abilities, as Schneider explained. First, it could understand unstructured text. "[Normally] we don't have to think about it, but we inherently understand what sentences are, and how verbs, nouns, etc. come together to produce text," Schneider said. Watson could read through human-generated content and parse it in a way that other systems hadn't been able to do before. Next, Watson could come up with its own hypotheses, and then return the one with the highest confidence. Finally, there's a machine learning component — one that's not hard-coded or programmed, but that actually learns as it goes. "When you were back in school, not too long ago for some, how did your teachers test you to see if you understood what you were reading?" Schneider asked. "They would give you feedback on your answers. [For instance], yes, full credit… maybe you got partial credit… or no, incorrect, here's what you should have done instead." Watson is able to "reason" in the same manner.
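To make that hypothesize-and-rank loop concrete, here is a minimal sketch of the general idea: generate candidate answers, score them against weighted evidence, return the most confident one, and nudge the weights when a grader hands back full, partial, or no credit. The candidate list, features, and weights below are invented for illustration and are not IBM's actual pipeline.

```python
# Toy illustration of "hypothesize, score, return the most confident answer"
# and learning from graded feedback. Not IBM's implementation; the candidates
# and scoring weights here are made up for the example.

def score(candidate, evidence_weights):
    # Sum the weights of the evidence features this candidate matches.
    return sum(evidence_weights.get(feature, 0.0) for feature in candidate["features"])

def best_hypothesis(candidates, evidence_weights):
    # Rank every hypothesis by confidence and return the top one.
    ranked = sorted(candidates, key=lambda c: score(c, evidence_weights), reverse=True)
    return ranked[0]

def apply_feedback(candidate, credit, evidence_weights, lr=0.1):
    # "Full credit" (1.0), "partial credit" (0.5), or "wrong" (0.0) adjusts
    # the weights of the features that produced this answer.
    for feature in candidate["features"]:
        evidence_weights[feature] = evidence_weights.get(feature, 0.0) + lr * (credit - 0.5)

weights = {"date_match": 0.4, "keyword_overlap": 0.3, "category_match": 0.2}
candidates = [
    {"answer": "Toronto", "features": ["keyword_overlap"]},
    {"answer": "Chicago", "features": ["keyword_overlap", "category_match", "date_match"]},
]
top = best_hypothesis(candidates, weights)
apply_feedback(top, credit=1.0, evidence_weights=weights)  # grader says "full credit"
print(top["answer"])
```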

Today, after continuous improvements, Watson consists of roughly 30 open APIs across four categories: language, speech, vision, and data insights. "Watson [today] has the ability to read through and understand unstructured data like a human and pull out the relevant answers and insights, and now images," Schneider said. She then walked through some recent examples of Watson's abilities. The first and arguably most significant one was a joint effort with Memorial Sloan Kettering Cancer Center. The goal was to train Watson to think like a doctor, in order to assist oncologists working with breast and colon cancers. IBM's team fed Watson a steady diet of medical journals, clinical trial results, encyclopedias, and textbooks to teach it the language of medicine.

From there, Watson could look at a patient's individual information, compare it against what the system knows about medicine, and come back with recommended treatment options. Schneider said it's still up to the doctor to decide how to use that information; it's not a question of human versus machine, but rather of how machines can enhance what humans can already do. In this case, the goal was to empower doctors so that they don't have to read an impossible 160 hours' worth of material each week — an actual estimate of how much new research is published weekly!
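As a rough illustration of that matching step, the sketch below filters a made-up evidence base against a patient record and ranks the surviving treatment options by evidence strength. Every entry, attribute, and score is invented; this is not IBM's system.

```python
# Toy sketch of "compare patient data against a body of evidence and rank
# treatment options." All records and scores are fabricated for illustration.

patient = {"diagnosis": "breast cancer", "stage": 2, "her2_positive": True}

evidence = [
    {"treatment": "Therapy A", "applies_if": {"diagnosis": "breast cancer", "her2_positive": True}, "support": 0.9},
    {"treatment": "Therapy B", "applies_if": {"diagnosis": "breast cancer"}, "support": 0.6},
    {"treatment": "Therapy C", "applies_if": {"diagnosis": "colon cancer"}, "support": 0.8},
]

def matches(record, conditions):
    # A study is relevant only if every condition matches the patient record.
    return all(record.get(key) == value for key, value in conditions.items())

options = [e for e in evidence if matches(patient, e["applies_if"])]
for option in sorted(options, key=lambda e: e["support"], reverse=True):
    print(f'{option["treatment"]}: evidence strength {option["support"]}')
```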


Next up was an application for the music industry. Quantone delivers in-depth information on music consumption. It not only leverages structured metadata the way Pandora, Spotify, and other music services do, such as the genre of music, the number of beats in songs, and so on, but, using IBM Watson technologies, it can also process unstructured data, such as album reviews, artist-curated content, and natural language classification. Using Quantone, as Schneider put it, an end user can say, "I'm looking for a playlist reminiscent of Michael Jackson from a certain time period," and get an answer that also pulls in and considers unstructured data.
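Here's a toy version of that structured-plus-unstructured query, assuming a small made-up catalog: filter tracks by genre and era first, then rank what's left by how strongly the free-text reviews echo the listener's request. None of this reflects Quantone's or Watson's actual APIs.

```python
# Toy sketch of combining structured metadata with unstructured text.
# The catalog, reviews, and scoring are invented for illustration.

catalog = [
    {"artist": "Artist A", "genre": "pop", "year": 1987,
     "review": "Slick late-80s pop with funk guitar and a Michael Jackson feel."},
    {"artist": "Artist B", "genre": "metal", "year": 1988,
     "review": "Heavy riffs and double-kick drums throughout."},
]

def review_score(text, query_terms):
    # Crude unstructured-text signal: count query terms in the review.
    text = text.lower()
    return sum(text.count(term) for term in query_terms)

def playlist(catalog, genre, year_range, query_terms):
    # Structured filter first (genre, era), then rank by the text signal.
    lo, hi = year_range
    hits = [t for t in catalog if t["genre"] == genre and lo <= t["year"] <= hi]
    return sorted(hits, key=lambda t: review_score(t["review"], query_terms), reverse=True)

print(playlist(catalog, "pop", (1985, 1990), ["michael jackson", "funk"]))
```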

Content creators can likewise benefit from AI-infused programming. Sampack offers algorithmically and artistically generated samples that are royalty-free. It's essentially an automated license-free music sample generator. It takes in descriptions of tones (such as "dark" or "mellow") and then translates them into an audio sample using Watson's Tone Analyzer capability. Sampack can understand descriptions and emotions and translate them into music effects, sounds, and filters.
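A very rough sketch of that idea follows: map recognized descriptive words to audio-filter settings and blend them. The mapping table and parameter names are invented, and Sampack's real use of the Tone Analyzer is certainly more sophisticated.

```python
# Toy sketch of turning descriptive words into audio-filter settings, in the
# spirit of the Sampack example. The mapping and parameter names are made up.

TONE_TO_PARAMS = {
    "dark":   {"lowpass_cutoff_hz": 800,  "reverb_wet": 0.6, "tempo_bpm": 80},
    "mellow": {"lowpass_cutoff_hz": 2000, "reverb_wet": 0.4, "tempo_bpm": 95},
    "bright": {"lowpass_cutoff_hz": 8000, "reverb_wet": 0.2, "tempo_bpm": 120},
}

def params_from_description(description):
    # Average the parameter sets of every descriptive word we recognize.
    matched = [TONE_TO_PARAMS[w] for w in description.lower().split() if w in TONE_TO_PARAMS]
    if not matched:
        return TONE_TO_PARAMS["mellow"]  # fall back to a neutral preset
    keys = matched[0].keys()
    return {k: sum(m[k] for m in matched) / len(matched) for k in keys}

print(params_from_description("dark mellow pads"))
```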

IBM also published a cookbook recently, which, as Schneider pointed out, isn't something you would have expected to hear before it happened. The book is called Cognitive Cooking with Chef Watson: Recipes for Innovation from IBM & the Institute of Culinary Education. Watson analyzed the molecular structure of foods, figured out what goes well together, took in inputs such as specific ingredients and what to exclude (such as gluten or other allergy triggers), and then created 100 new recipes from that query. It doesn't search through an existing recipe database for these; it creates the recipes based on your inputs. The first recipe is usually pretty normal; by the time it gets to recipe 100, it's "a little out there," as Schneider put it.
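The sketch below captures the flavor of that constraint-driven generation, assuming a made-up pairing table: honor the exclusions, require the requested ingredients, and order the results from conventional to adventurous. It is a toy stand-in, not Chef Watson's chemistry-aware model.

```python
# Toy sketch of constraint-based recipe generation: combine ingredients that
# pair well, honor exclusions, and order results from conventional to novel.
# Pairing scores are invented for the example.

from itertools import combinations

PAIRING = {  # higher = more conventional pairing (made-up numbers)
    ("tomato", "basil"): 0.95, ("tomato", "mozzarella"): 0.9,
    ("basil", "mozzarella"): 0.85, ("tomato", "chocolate"): 0.2,
    ("basil", "chocolate"): 0.15, ("mozzarella", "chocolate"): 0.1,
}

def pair_score(a, b):
    return PAIRING.get((a, b), PAIRING.get((b, a), 0.5))

def generate_recipes(pantry, must_include, exclude, size=3, count=5):
    pool = [i for i in pantry if i not in exclude]
    recipes = [c for c in combinations(pool, size) if all(m in c for m in must_include)]
    # Most conventional combinations first, most adventurous last.
    recipes.sort(key=lambda r: sum(pair_score(a, b) for a, b in combinations(r, 2)), reverse=True)
    return recipes[:count]

pantry = ["tomato", "basil", "mozzarella", "chocolate", "wheat flour"]
print(generate_recipes(pantry, must_include=["tomato"], exclude=["wheat flour"]))
```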

In the art world, World of Watson was a recent exhibit (pictured below) by Stephen Holding in Brooklyn, created in collaboration with IBM Watson using a variant of a color API. Watson mined Watson-specific brand imagery and came up with a suggested color palette for Holding to use. The goal was to evoke innovation, passion, and creativity with an original piece of art.
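For a sense of how a palette can be pulled from a pile of images without any Watson service at all, here is a plain k-means sketch over pooled pixels. The file paths are placeholders, and this is offered only as a generic stand-in for whatever IBM's color API actually does.

```python
# Toy sketch of deriving a color palette from a set of brand images using
# plain k-means clustering over pixels. Requires pillow and numpy; the file
# paths are placeholders.

from PIL import Image
import numpy as np

def palette_from_images(paths, n_colors=5, iterations=10):
    # Pool pixels from every image into one array of RGB rows.
    pixels = np.vstack([
        np.asarray(Image.open(p).convert("RGB").resize((64, 64))).reshape(-1, 3)
        for p in paths
    ]).astype(float)

    # Plain k-means: random starting centers, then alternate assign/update.
    rng = np.random.default_rng(0)
    centers = pixels[rng.choice(len(pixels), n_colors, replace=False)]
    for _ in range(iterations):
        labels = np.argmin(((pixels[:, None] - centers[None]) ** 2).sum(-1), axis=1)
        for k in range(n_colors):
            if (labels == k).any():
                centers[k] = pixels[labels == k].mean(axis=0)
    return [tuple(int(c) for c in center) for center in centers]

# print(palette_from_images(["brand1.jpg", "brand2.jpg"]))  # placeholder paths
```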

Stephen Holding IBM Watson World of Watson Art

Finally, IBM Watson partnered with fashion label Marchesa for the recent Metropolitan Museum of Art gala, where model Karolina Kurkova wore the result. Watson was tasked with coming up with a new dress design that was "inherently Marchesa and true to the brand." Watson was involved in every step of the way. Using another color API variant, Watson mined hundreds of images from Marchesa, including model photos, to get a feel for the color palette, Schneider said. Then Inno360 (an IBM Watson ecosystem partner) used several APIs and considered 40,000 options for fabric. With inputs from Marchesa that were consistent with the brand, while also evaluating fabrics that would work with embedded LEDs, Watson came up with 35 distinct choices. The third step involved embedding the LED technology into the dress using the Tone Analyzer, with specific colors being lit up through the flowers.
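The LED step can be pictured with a small sketch like the one below, which blends per-emotion colors according to tone scores. The emotion names, RGB values, and input scores are all invented for illustration; this is not the production system.

```python
# Toy sketch of driving LED colors from tone-analysis scores. The emotion
# names, color mapping, and input scores are made up for the example.

EMOTION_COLORS = {          # RGB values chosen arbitrarily for the example
    "joy":     (255, 200, 40),
    "sadness": (40, 80, 255),
    "anger":   (255, 40, 40),
    "calm":    (40, 255, 160),
}

def led_color(tone_scores):
    # Blend the emotion colors, weighted by each emotion's score.
    total = sum(tone_scores.values()) or 1.0
    blended = [0.0, 0.0, 0.0]
    for emotion, score in tone_scores.items():
        r, g, b = EMOTION_COLORS.get(emotion, (255, 255, 255))
        for i, channel in enumerate((r, g, b)):
            blended[i] += channel * score / total
    return tuple(int(c) for c in blended)

print(led_color({"joy": 0.7, "calm": 0.2, "sadness": 0.1}))
```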

Today, anyone can get started working with IBM Watson by heading to IBM Bluemix and signing up for a Watson Developer Cloud account. Back in February 2016, IBM boosted Watson Developer Cloud with speech-to-text, image analysis, visual recognition, and the ability to analyze tradeoffs between different drug candidates. In July last year, Watson gained a new Tone Analyzer that could scan a piece of text and then critique the tone of your writing. We've also interviewed IBM's Jerome Pesenti about many of the latest Watson developments.
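If you want to poke at the Tone Analyzer yourself, a bare-bones HTTP call looked roughly like the sketch below during the Bluemix era. The endpoint URL, version date, and credential scheme shown here are assumptions that may not match the current service, so substitute the values from your own instance.

```python
# Minimal sketch of calling the Tone Analyzer service over HTTP with the
# requests library. The endpoint URL, version date, and credentials are
# assumptions based on the Bluemix-era service and may differ today.

import requests

WATSON_URL = "https://gateway.watsonplatform.net/tone-analyzer/api/v3/tone"  # assumed endpoint
USERNAME = "your-service-username"   # placeholder credentials from Bluemix
PASSWORD = "your-service-password"

def analyze_tone(text):
    response = requests.post(
        WATSON_URL,
        params={"version": "2016-05-19"},        # assumed API version date
        auth=(USERNAME, PASSWORD),
        headers={"Content-Type": "application/json"},
        json={"text": text},
    )
    response.raise_for_status()
    return response.json()

# Example: critique the tone of a draft sentence.
# print(analyze_tone("I am absolutely thrilled about these results."))
```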