Frederic Van Haren

Frederic Van Haren is the Chief Technology Officer @HighFens. He has over 20 years of experience in high tech and is known for his insights into HPC, Big Data, and AI, drawn from hands-on experience leading research and development teams. He has provided technical leadership and strategic direction in the telecom and speech markets. He spent more than a decade at Nuance Communications building large HPC and AI environments from the ground up and is frequently invited to speak at events to share his vision of the HPC, AI, and storage markets. Frederic has also served as the president of a variety of technology user groups promoting the use of innovative technology. As an engineer, he enjoys working directly with engineering teams from technology vendors and on challenging customer projects. Frederic lives in Massachusetts, USA, but grew up in the northern part of Belgium, where he received his Master's in Electrical Engineering, Electronics and Automation.

AI Trends That’ll Impact Businesses in 2020

Nearly 84 percent of C-suite executives say they will need artificial intelligence (AI) to achieve their growth objectives, according to a recent report by the consulting firm Accenture, which surveyed 1,500 executives across 16 industries. In the same report, 75 percent of executives said they expect to go out of business within the next five years if they do not deploy AI effectively.
Read more

Natural Language Processing (NLP): meaningful advancements with BERT

Of all the data available to us, only 20% [1] is in a structured or pre-defined format. The remaining 80% is unstructured, with most of it being textual data. The lack of structure increases complexity and makes it challenging to process the data, find patterns, and apply analytics in a meaningful way. Examples of unstructured data include documents, audio files, images, and videos. Many sources on the Internet, such as social media and news sites, generate massive amounts of unstructured data.
Read more
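Much of BERT's ability to handle raw text comes from subword tokenization: words are broken into pieces from a fixed vocabulary so that even unseen words can be represented. As a rough illustration (not BERT's actual vocabulary or implementation), here is a toy greedy longest-match tokenizer in the style of WordPiece, using a small made-up vocabulary:

```python
# Toy WordPiece-style tokenizer: greedy longest-match against a
# hypothetical mini-vocabulary (BERT's real vocab has ~30,000 entries).
VOCAB = {"un", "##struct", "##ured", "data", "proc", "##ess", "##ing", "[UNK]"}

def wordpiece(word, vocab=VOCAB):
    """Greedily split a word into the longest subwords found in the vocab."""
    pieces, start = [], 0
    while start < len(word):
        end, match = len(word), None
        while start < end:
            piece = word[start:end]
            if start > 0:
                piece = "##" + piece  # "##" marks a word-continuation piece
            if piece in vocab:
                match = piece
                break
            end -= 1  # shrink the candidate until it is in the vocab
        if match is None:
            return ["[UNK]"]  # no piece matches: the whole word is unknown
        pieces.append(match)
        start = end
    return pieces

print(wordpiece("unstructured"))  # ['un', '##struct', '##ured']
print(wordpiece("processing"))    # ['proc', '##ess', '##ing']
```

Because every word decomposes into known pieces (or `[UNK]`), the model never faces an out-of-vocabulary token, which is one reason BERT copes well with messy, unstructured text.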

Accelerate Artificial Intelligence with Transfer Learning

Any article on Artificial Intelligence (AI) will point out that addressing an AI problem requires large amounts of data to deliver meaningful results. Training that data into a usable AI model demands a massive amount of compute capacity. On top of that, it is challenging to find resources from a limited pool of people whose skills are in high demand. Putting all the pieces together is costly and time-consuming.
Read more
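The core idea behind transfer learning is to reuse a model trained on one task and retrain only a small part of it for a new task, cutting both the data and the compute required. A minimal sketch, assuming entirely synthetic data and a stand-in "pretrained" base (not a real pretrained network):

```python
import numpy as np

rng = np.random.default_rng(0)

# Stand-in for a pretrained feature extractor: its weights stay FROZEN.
W_base = rng.normal(size=(8, 4))

def extract_features(x):
    """Frozen base network: never updated during fine-tuning."""
    return np.tanh(x @ W_base)

# Small labeled dataset for the new task (synthetic, for illustration only).
X = rng.normal(size=(64, 8))
y = (X[:, 0] + X[:, 1] > 0).astype(float)

F = extract_features(X)          # features are computed once and reused
w, b = np.zeros(4), 0.0          # only this tiny head is trainable

for _ in range(500):             # plain gradient descent on the head alone
    p = 1.0 / (1.0 + np.exp(-(F @ w + b)))   # sigmoid prediction
    grad = p - y                              # logistic-loss gradient
    w -= 0.1 * F.T @ grad / len(y)
    b -= 0.1 * grad.mean()

accuracy = ((p > 0.5) == y).mean()
print(f"head-only training accuracy: {accuracy:.2f}")
```

Only the four head weights and the bias are updated; the base weights never change. In a real workflow the frozen base would be a large network such as BERT or a pretrained image model, which is exactly where the savings in data, compute, and scarce expertise come from.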

The Data-centric World of Artificial Intelligence (AI)

[Compute-centric]

In the early days of High-Performance Computing (HPC), the concept of solving complex problems was driven by how much processing power was available to achieve a goal within a reasonable amount of time. At the same time, the algorithms that made up HPC applications were designed to take full advantage of that processing capability, with no limits in sight…

Read more

From Big Data to Machine Learning to Deep Learning: the Progress of AI

[HPC]

High-Performance Computing (HPC) became popular in the 1960s with governments and academia, which needed to solve large computational problems and could afford to take advantage of the advancements in computer technology. Until then, most complex problems were solved manually by people, a slow and error-prone process. HPC systems could solve those problems far faster and more accurately.

Read more