Showing posts with label GPU. Show all posts

Wednesday, 27 March 2019

Build your own Robust Deep Learning Environment in Minutes by Dipanjan (DJ) Sarkar via @Medium

You have many options for configuring a GPU-enabled training environment in the cloud. This article surveys several of them, covering minimal-configuration cloud-based deep learning providers (Google Colaboratory, Paperspace Gradient, Lambda GPU Cloud, AWS Deep Learning AMIs, etc.) that give you point-and-click access to a ready-made environment.

I found this really interesting and it was great to have it explained so clearly.
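Whichever provider you choose, a sensible first step once you land in the environment is to confirm a GPU is actually visible. A minimal sketch of such a check is below; it assumes the NVIDIA driver's `nvidia-smi` tool is on the PATH (as it is on Colab and most GPU cloud images) and simply returns False on CPU-only machines rather than failing.

```python
import shutil
import subprocess

def gpu_available() -> bool:
    """Return True if an NVIDIA GPU is visible via nvidia-smi.

    A quick sanity check for a fresh cloud deep learning environment;
    returns False cleanly on CPU-only machines instead of raising.
    """
    # nvidia-smi ships with the NVIDIA driver; if it isn't installed,
    # there is no usable NVIDIA GPU in this environment.
    if shutil.which("nvidia-smi") is None:
        return False
    try:
        # "-L" lists the detected GPUs, one per line, e.g.
        # "GPU 0: Tesla T4 (UUID: ...)"
        result = subprocess.run(
            ["nvidia-smi", "-L"], capture_output=True, text=True, timeout=10
        )
        return result.returncode == 0 and "GPU" in result.stdout
    except (subprocess.SubprocessError, OSError):
        return False

print(gpu_available())
```

On a Colab notebook with a GPU runtime this prints True; on a plain laptop it prints False, which makes it a handy guard before kicking off training.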

Tuesday, 29 January 2019

WEBINAR: Cutting Time, Complexity and Costs from Data Science to Production - 6th February 2019

WEBINAR

Cutting Time, Complexity and Costs from Data Science to Production

One-click (really!) deployment to production without any heavy lifting from data and DevOps engineers
Wednesday, February 6 at 8am PT
Imagine a system where you collect real-time data, develop a machine learning model… run analysis and training on powerful GPUs… click a magic button and deploy code and ML models to production… all without any heavy lifting from data engineers. Today, data scientists work on laptops with just a subset of the data, and time is wasted waiting for data and compute.
It’s about efficient use of time! Join Iguazio and NVIDIA so that you can get home early today! Learn how to speed up data science from development to production:
  • Access to large scale, real-time and operational data without waiting for ETL
  • Run high-performance analytics and ML on NVIDIA GPUs (RAPIDS)
  • Work on a shared, pre-integrated Kubernetes cluster with Jupyter notebook and leading data science tools
Featured Speakers:
Yaron Haviv, CTO, Iguazio
Or Zilberman, Data Scientist, Iguazio
Jacci Cenci, Sr Technical Marketing Engineer, NVIDIA
Register here


Saturday, 27 January 2018

The 10 most important breakthroughs in Artificial Intelligence by James O'Malley via @techradar

“Artificial Intelligence” is currently the hottest buzzword in tech. And with good reason - after decades of research and development, the last few years have seen a number of techniques that have previously been the preserve of science fiction slowly transform into science fact.

Great reminder of what has already been achieved. It's only when you stop and think about it that you realise just how far we have already come.

Sunday, 3 December 2017

It’s time to solve deep learning’s productivity problem by Hillery Hunter via @VentureBeat

Deep learning is fuelling breakthroughs in everything from consumer mobile apps to image recognition. Yet running deep learning-based AI models poses many challenges. One of the most difficult roadblocks is the time it takes to train the models.

Hillery makes some good points and gives a lot to think about.

Friday, 20 June 2014

Who are the Big Data players?

This blog post by SQream Technologies looks at who the big players in the world of Big Data really are.

In it they look at the five areas making the most use of Big Data so far - and it is no surprise that they are all areas with large amounts of data, and so well placed to maximise the benefits.

If you are ever at a Big Data event I suggest you go and investigate their GPU technology, as it sounds exciting.