For the first couple of years that Microsoft held its Build conference[1], the event was all about Windows. In the years since, the scope has widened and Build has become the company's broad annual developer confab. At this year's show, being held today through Wednesday in Seattle, there is no shortage of data- and AI-related announcements and demonstrations. If you needed proof that both are crucial to Microsoft's success, even eclipsing Windows in importance, this year's show is it.

On the AI side, there's so much to discuss that it's hard to know where to begin. Luckily, in a private briefing, Microsoft's Matt Winkler[2] helped me understand the AI announcements at a depth that allows me to explain them better to you. Without that briefing, I'd just be regurgitating text from press releases. And that's no fun.

I do so like AML and HAM
I'll start with the part that is perhaps the most complicated to explain, but potentially the most interesting: the announced preview of Azure Machine Learning[3] Hardware Accelerated Models. (I am going to refer to this service as AMLHAM - this is not Microsoft's acronym, mind you, and despite its sounding like a brand name for an unhealthy luncheon meat, it's still better than typing the full name out each time.)


AMLHAM is the output of an internal project at Microsoft with the nerdy name of Project Brainwave[5], and it's all based on a hardware technology called Field Programmable Gate Arrays, or FPGAs[6]. Let's take a look at these terms one at a time and see if we can't figure it all out.

Gimme an F, gimme a P
An FPGA is a chip whose logic circuits can be reconfigured after manufacture, allowing it to be tailored to specific workloads without the cost of fabricating custom silicon.
