• About
    Please note: This session is part of AI Day. Learn more and register for the full event here.

    Did you know more than 1 billion tons of food is wasted every year? What if we could reduce that? What if optimized AI were the key?

    Building AI applications to solve real-world problems is our common challenge as developers. Solving a challenge like food waste at scale depends on optimized, efficient inference.

    In this session, we’ll show you how to optimize and accelerate your deep learning neural network model to achieve fast AI inference at the edge. This is made possible with Intel’s OpenVINO™, an open-source toolkit for neural network model optimization and easy deployment across multiple hardware platforms.

    3 Takeaways:

    • How to run fast AI inference with your existing hardware

    • How to set up and run AI inference with just 6 lines of code

    • How to use the OpenVINO optimization tools to run faster AI inference
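    The quick setup promised above follows OpenVINO's standard Python flow: create a Core, load (or build) a model, compile it for a target device, and run inference. Below is a minimal sketch, assuming the openvino package is installed (pip install openvino); the tiny in-memory ReLU model is a stand-in for a real network you would normally load from files with core.read_model():

    ```python
    import numpy as np
    import openvino as ov
    from openvino.runtime import opset8 as ops

    # 1. Create the runtime core (device discovery, plugin loading).
    core = ov.Core()

    # 2. Build a trivial one-op model in memory: y = relu(x).
    #    A real workflow would instead call core.read_model("model.xml").
    x = ops.parameter([1, 4], dtype=np.float32, name="x")
    model = ov.Model([ops.relu(x)], [x], "tiny_relu")

    # 3. Compile the model for a target device ("CPU", "GPU", ...).
    compiled = core.compile_model(model, "CPU")

    # 4. Run inference; index 0 selects the model's first output.
    out = compiled(np.array([[-1.0, 0.5, -2.0, 3.0]], np.float32))[0]
    ```

    Swapping the device string (for example "GPU") is all it takes to retarget the same model, which is the portability point the session highlights.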

    Sponsored by:
  • Access: Anyone with the event link can attend
  • Dial-in (listen only): Not available