Artificial Intelligence

Cyber resilience and machine learning – The perfect partnership

The business of the future will need to make the journey from cybersecurity to cyber resilience if it wishes to protect its valuable data.

Malicious parties are on the hunt for valuable data, and your organization is their target. This has long been the fundamental reason behind cybersecurity, but the truth is that cybersecurity alone is no longer sufficient. Organizations can no longer afford to react to incidents after they have already happened with a “What now?”…

Read more

An Overview of Artificial Intelligence and NonStop

Alan Turing OBE (Officer of the Order of the British Empire), FRS (Fellow of the Royal Society), born 23 June 1912 and died 7 June 1954, was an English mathematician, computer scientist, logician, cryptanalyst, philosopher, and theoretical biologist; or, as we would say today, an under-achiever. Turing was highly influential in the development of theoretical computer science, providing a formalization of the concepts of algorithm and computation with the Turing machine, which can be considered a model of a general-purpose computer.
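To make that idea concrete, a Turing machine is simple enough to sketch in a few lines. The following minimal Python simulator is a hypothetical illustration (not from the article): a transition table maps (state, symbol) pairs to a next state, a symbol to write, and a head move, which is essentially all a Turing machine is. The sample machine increments a binary number written on the tape.

```python
# Minimal Turing machine simulator (hypothetical illustration).
def run_turing_machine(tape, transitions, state="start", blank="_"):
    """Run a one-tape Turing machine until it reaches the 'halt' state."""
    cells = dict(enumerate(tape))  # sparse tape: position -> symbol
    head = 0
    while state != "halt":
        symbol = cells.get(head, blank)
        state, write, move = transitions[(state, symbol)]
        cells[head] = write
        head += 1 if move == "R" else -1
    span = range(min(cells), max(cells) + 1)
    return "".join(cells.get(i, blank) for i in span).strip(blank)

# (state, read) -> (next state, write, head move).
# 'start' scans right to the end of the input; 'carry' adds one from the right.
increment = {
    ("start", "0"): ("start", "0", "R"),
    ("start", "1"): ("start", "1", "R"),
    ("start", "_"): ("carry", "_", "L"),
    ("carry", "1"): ("carry", "0", "L"),
    ("carry", "0"): ("halt", "1", "R"),
    ("carry", "_"): ("halt", "1", "R"),
}

print(run_turing_machine("1011", increment))  # binary 11 + 1 -> "1100"
```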
Read more

Virtual Automotive Assistants

Introduction

We recently designed and deployed an Artificial Intelligence (AI) solution for an automotive customer based on our many years of experience building sizeable High-Performance Computing (HPC) clusters for AI. The proposed data-centric architecture heavily relied on an innovative storage solution from Weka, an HPE partner. The customer was looking for a hybrid solution that would allow them to run on-premises and take advantage of the public cloud for peak capacity and future growth…

Read more

AI Trends That’ll Impact Businesses in 2020

Nearly 84 percent of C-suite executives say they will need artificial intelligence (AI) to achieve their growth objectives, according to a recent report from the consulting firm Accenture, which surveyed 1,500 executives across 16 industries. In the same report, 75 percent of executives said they expect to go out of business within the next five years if they do not deploy AI effectively.
Read more

Natural Language Processing (NLP): meaningful advancements with BERT

Of all the data available to us, only 20% [1] is in a structured, pre-defined format. The remaining 80% is unstructured, and most of it is textual. The lack of structure increases complexity and makes it challenging to process the data, find patterns, and apply analytics in a meaningful way. Examples of unstructured data include documents, audio files, images, and videos. Many sources on the Internet, such as social media and news sites, generate massive amounts of unstructured data.
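As a concrete illustration of how such unstructured text can be processed, here is a minimal sketch of BERT's masked-language modeling; it assumes the Hugging Face transformers library and the bert-base-uncased checkpoint, neither of which the article specifies.

```python
# A minimal sketch, assuming the Hugging Face `transformers` library.
from transformers import pipeline

# BERT predicts a hidden word from its surrounding context, which is the
# pre-training task that lets it model unstructured text.
fill_mask = pipeline("fill-mask", model="bert-base-uncased")

for candidate in fill_mask("Unstructured data is hard to [MASK]."):
    print(f"{candidate['token_str']:>12}  score={candidate['score']:.3f}")
```

Each candidate is a plausible completion ranked by probability, hinting at how a model like BERT extracts patterns from free-form text.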
Read more

Accelerate Artificial Intelligence with Transfer Learning

Any article on Artificial Intelligence (AI) will point out that addressing an AI problem requires large amounts of data to deliver meaningful results. Training a usable AI model on that data requires massive compute capacity. On top of that, it is challenging to staff projects from a limited pool of people whose skills are in high demand. Putting all the pieces together is costly and time-consuming.
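Transfer learning addresses exactly that cost: instead of training from scratch, you reuse a model someone else trained and adapt only a small part of it. The sketch below uses PyTorch and an ImageNet-pretrained ResNet-18 as assumed examples (the article names neither framework nor model): the pretrained feature extractor is frozen, and only a new classification head is trained.

```python
# A minimal transfer-learning sketch, assuming PyTorch/torchvision.
import torch
import torch.nn as nn
from torchvision import models

NUM_CLASSES = 5  # hypothetical number of classes in the new task

# Start from weights learned on ImageNet.
model = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)

# Freeze the pretrained layers so training does not update them.
for param in model.parameters():
    param.requires_grad = False

# Replace the final classifier with a new, trainable layer for our task.
model.fc = nn.Linear(model.fc.in_features, NUM_CLASSES)

# Only the new head's parameters need gradients, so training takes far
# less data and compute than training the whole network.
optimizer = torch.optim.SGD(model.fc.parameters(), lr=1e-3, momentum=0.9)
```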
Read more

Where the Rubber Meets the Road: How Users See the IT4IT Standard Building Competitive Business Advantage

Dana Gardner: Hi, this is Dana Gardner, Principal Analyst at Interarbor Solutions, and you’re reading BriefingsDirect. Our next IT operations strategy panel discussion explores how the IT4IT™ Reference Architecture for IT management creates demonstrated business benefits – in many ways, across many types of organizations.

Since The Open Group delivered it in 2015, IT4IT has focused on defining, sourcing, consuming, and managing services across the IT function’s value stream to its stakeholders.

Read more

The Data-centric World of Artificial Intelligence (AI)

[Compute-centric]

In the early days of High-Performance Computing (HPC), solving complex problems was driven by how much processing power was available to achieve a goal within a reasonable amount of time. At the same time, the algorithms that made up HPC applications were designed to take full advantage of that processing capability, with no limits in sight…

Read more

From Big Data to Machine Learning to Deep Learning: the progress of AI

[HPC]

High-Performance Computing (HPC) became popular in the 1960s with governments and academia, which needed to solve large computational problems and could afford to take advantage of the advances in computer technology. Until then, most complex problems were solved manually by people, a slow and error-prone process. HPC systems could solve those problems far faster and more accurately.

Read more