Artificial Intelligence

Natural Language Processing (NLP): meaningful advancements with BERT

Of all the data available to us, only 20% [1] is in a structured, pre-defined format. The remaining 80% is unstructured, and most of it is textual. The lack of structure increases complexity and makes it challenging to process the data, find patterns, and apply analytics in a meaningful way. Examples of unstructured data include documents, audio files, images, and videos. Many sources on the Internet, such as social media and news sites, generate massive amounts of unstructured data.
Read more

Accelerate Artificial Intelligence with Transfer Learning

Any article on Artificial Intelligence (AI) will point out that addressing an AI problem requires large amounts of data to deliver meaningful results. Training that data into a usable AI model demands massive compute capacity. On top of that, it is challenging to find resources from a limited pool of people whose skills are in high demand. Putting all the pieces together is costly and time-consuming.
Read more

Where the Rubber Meets the Road: How Users See the IT4IT Standard Building Competitive Business Advantage

Dana Gardner: Hi, this is Dana Gardner, Principal Analyst at Interarbor Solutions, and you’re reading BriefingsDirect. Our next IT operations strategy panel discussion explores how the IT4IT™ Reference Architecture for IT management creates demonstrated business benefits – in many ways, across many types of organizations.

Since its delivery in 2015 by The Open Group, IT4IT has focused on defining, sourcing, consuming, and managing services across the IT function’s value stream to its stakeholders.

Read more

The Data-centric World of Artificial Intelligence (AI)


In the early days of High-Performance Computing (HPC), the concept of solving complex problems was driven by how much processing power was available to achieve a goal within a reasonable amount of time. At the same time, the algorithms that made up the HPC applications were designed to take full advantage of the processing capabilities, with no limits in sight…

Read more

From Big Data to Machine Learning to Deep Learning: the Progress of AI


High-Performance Computing (HPC) became popular in the 1960s with governments and academia, which needed to solve large computational problems and could afford to take advantage of advances in computer technology. Until then, most complex problems were solved manually by people, a slow and error-prone process. HPC systems could solve those problems much faster and more accurately.

Read more

The world’s most intelligent storage with new AI, cloud data mobility, and all-flash enhancements


Recently, I was in Madrid for HPE Discover. With the opening of the event, I shared big storage news. My article talks about enhancements to InfoSight, Memory-Driven Flash, HPE Cloud Volumes and more. So read on!

Dr. Heinz-Herman Adam led a standing-room-only Connect Tech Forum at HPE Discover Madrid on how the University of Münster uses 3PAR in its multi-tier storage strategy.

Read more

Your Future Doctor May Not Be Human
But It May Be Powered By HPE


I am a nerd. My nerdiness embraces characters such as Marvel’s “The Avengers”. Not the “Infinity War” kind but the “Age of Ultron” kind. In “Age of Ultron”, the headliner is an archetype for artificial intelligence gone bad. But then, what could one expect when the Official Handbook of the Marvel Universe lists his occupation as “would-be conqueror, enslaver of men”.

Read more