Artificial Intelligence

Natural Language Processing (NLP): meaningful advancements with BERT

Of all the data available to us, only 20% [1] is in a structured or pre-defined format. The remaining 80% is unstructured, and most of it is text. The lack of structure increases complexity, making it challenging to process the data, find patterns, and apply analytics in a meaningful way. Examples of unstructured data include documents, audio files, images, and videos. Many sources on the Internet, such as social media and news sites, generate massive amounts of unstructured data.


Accelerate Artificial Intelligence with Transfer Learning

Any article on Artificial Intelligence (AI) will point out that addressing an AI problem requires large amounts of data to deliver meaningful results. Training a usable AI model on that data demands massive compute capacity. On top of that, the people with the necessary skills are a limited pool in high demand, making projects hard to staff. Putting all the pieces together is costly and time-consuming.


Where the Rubber Meets the Road: How Users See the IT4IT Standard Building Competitive Business Advantage

Dana Gardner: Hi, this is Dana Gardner, Principal Analyst at Interarbor Solutions, and you’re reading BriefingsDirect. Our next IT operations strategy panel discussion explores how the IT4IT™ Reference Architecture for IT management creates demonstrated business benefits – in many ways, across many types of organizations. Since its delivery in 2015…


The Data-centric World of Artificial Intelligence (AI)

[Compute-centric] In the early days of High-Performance Computing (HPC), the concept of solving complex problems was driven by how much processing power was available to achieve a goal within a reasonable amount of time. At the same time, the algorithms that made up HPC applications were designed to…
