DynaSys was invited to demonstrate our advanced Artificial Intelligence (AI) solutions and the NVIDIA DGX Station at the LSCM Logistics Summit 2019. Many guests were impressed by our solutions.


DynaSys won the championship at the NVIDIA NPN Partner Bootcamp 2019. This achievement reaffirms our deep expertise in selling and supporting AI solutions in the Hong Kong and Macao market.

 

In this era of diverse and rapid change, retailers are searching for new paths to success. At the Hong Kong Retail Summit, DynaSys showcased SkyREC A.I. video analytics and the SAP C/4HANA suite, which together significantly enhance the customer experience.

Retail will never die, but a retailer that fails to excite will never survive.

Increase customer engagement with your products using AI video analytics for retail stores. Learn more from the case studies above.

Data scientists working in data analytics, machine learning and deep learning will get a massive speed boost with NVIDIA’s new CUDA-X AI libraries.

Unlocking the flexibility of Tensor Core GPUs, CUDA-X AI accelerates:

  • … data science from ingest of data, to ETL, to model training, to deployment.
  • … machine learning algorithms for regression, classification, clustering.
  • … every deep learning training framework and, with this release, automatically optimizes for NVIDIA Tensor Core GPUs.
  • … inference and large-scale Kubernetes deployment in the cloud.
  • … data science on PCs, workstations, supercomputers, in the cloud and in enterprise data centers.
  • … data science in Amazon Web Services, Google Cloud and Microsoft Azure AI services.
  • … data science.

CUDA-X AI accelerates data science.

Introduced today at NVIDIA’s GPU Technology Conference, CUDA-X AI is the only end-to-end platform for the acceleration of data science.

CUDA-X AI arrives as businesses turn to AI — deep learning, machine learning and data analytics — to make data more useful.

The typical workflow for all of these is the same: data processing, feature determination, training, verification and deployment.

CUDA-X AI unlocks the flexibility of our NVIDIA Tensor Core GPUs to uniquely address this end-to-end AI pipeline.

Capable of speeding up machine learning and data science workloads by as much as 50x, CUDA-X AI consists of more than a dozen specialized acceleration libraries.

It’s already accelerating data analysis with cuDF; deep learning primitives with cuDNN; machine learning algorithms with cuML; and data processing with DALI, among others.
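To make that concrete, here is a minimal Python sketch of a GPU-accelerated pipeline using two of those libraries, cuDF and cuML. The input file and column names are invented for illustration and are not part of NVIDIA's announcement.

    # A minimal sketch assuming a CUDA GPU with the RAPIDS packages installed;
    # the CSV file and its column names are hypothetical.
    import cudf
    from cuml.cluster import KMeans

    # Load a CSV straight into GPU memory with cuDF's pandas-like API.
    df = cudf.read_csv("transactions.csv")

    # Simple ETL on the GPU: drop missing rows and derive a scaled feature.
    df = df.dropna()
    df["amount_scaled"] = df["amount"] / df["amount"].max()

    # Fit a clustering model on the GPU with cuML's scikit-learn-like API.
    kmeans = KMeans(n_clusters=8, random_state=0)
    kmeans.fit(df[["amount_scaled", "quantity"]])
    print(kmeans.cluster_centers_)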

Together, these libraries accelerate every step in a typical AI workflow, whether it involves using deep learning to train speech and image recognition systems or data analytics to assess the risk profile of a mortgage portfolio.

Each step in these workflows requires processing large volumes of data, and each step benefits from GPU accelerated computing.

Broad Adoption

As a result, CUDA-X AI is relied on by top companies such as Charter, Microsoft, PayPal, SAS and Walmart.

It’s integrated into major deep learning frameworks such as TensorFlow, PyTorch and MXNet.
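As a rough illustration of what that integration looks like from a framework user's point of view, the PyTorch sketch below runs a single training step on the GPU, where cuDNN-backed kernels are picked up automatically. The model and data are placeholders, not anything described in the article.

    import torch
    import torch.nn as nn

    # cuDNN kernels are used automatically for GPU convolutions; benchmark
    # mode lets cuDNN pick the fastest algorithm for these tensor shapes.
    torch.backends.cudnn.benchmark = True
    device = torch.device("cuda" if torch.cuda.is_available() else "cpu")

    # A small placeholder convolutional network.
    model = nn.Sequential(
        nn.Conv2d(3, 16, kernel_size=3, padding=1),
        nn.ReLU(),
        nn.AdaptiveAvgPool2d(1),
        nn.Flatten(),
        nn.Linear(16, 10),
    ).to(device)

    optimizer = torch.optim.SGD(model.parameters(), lr=0.01)
    loss_fn = nn.CrossEntropyLoss()

    # One training step on random placeholder data.
    images = torch.randn(32, 3, 224, 224, device=device)
    labels = torch.randint(0, 10, (32,), device=device)

    optimizer.zero_grad()
    loss = loss_fn(model(images), labels)
    loss.backward()
    optimizer.step()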

Major cloud service providers around the world use CUDA-X AI to speed up their cloud services.

And today eight of the world’s leading computer makers announced data science workstations and servers optimized to run NVIDIA’s CUDA-X AI libraries.

Available Everywhere

CUDA-X AI acceleration libraries are freely available as individual downloads or as containerized software stacks from the NVIDIA NGC software hub.

They can be deployed everywhere, including desktops, workstations, servers and on cloud computing platforms.

It’s integrated into all the data science workstations announced at GTC today. And all the NVIDIA T4 servers announced today are optimized to run CUDA-X AI.

Learn more at https://www.nvidia.com/en-us/technologies/cuda-x.

Shopping in the future may feel a lot like shoplifting does today — without the risk of getting nabbed — if two artificial intelligence startups have their way.

New Zealand’s IMAGR and Silicon Valley’s Mashgin aim to make checking out of grocery stores and company cafeterias a walk in the park. Almost literally.

Many supermarkets offer self-checkout to save shoppers time. IMAGR founder William Chomley wants to skip the checkout altogether, so you can just walk right out the door. It’s similar to the idea behind Amazon Go, being tested in a grocery store in downtown Seattle, which lets customers shop without ever stopping at a cashier on the way out.

IMAGR makes SmartCart, an ordinary grocery cart with an AI computing video camera attached. The device tracks what goes into the cart, tallies the total along the way and syncs that with payment information on the shopper’s mobile phone.
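The article doesn't describe IMAGR's software, but the running-tally behaviour it mentions can be pictured with a toy Python sketch; the item names, prices and detection events below are invented purely for illustration.

    # Item prices and detection events are hypothetical.
    PRICES = {"milk": 3.50, "kale": 2.00, "ice_cream": 6.00}

    def update_cart(cart, event, item):
        """Apply one detected 'add' or 'remove' event and return the new total."""
        if event == "add":
            cart[item] = cart.get(item, 0) + 1
        elif event == "remove" and cart.get(item, 0) > 0:
            cart[item] -= 1
        return sum(PRICES[name] * count for name, count in cart.items())

    cart, total = {}, 0.0
    for event, item in [("add", "ice_cream"), ("add", "kale"), ("remove", "ice_cream")]:
        total = update_cart(cart, event, item)
    print(f"Total to charge: ${total:.2f}")   # would be synced to the payment app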

“We want to give people the ability to shop as they normally would, and then just walk past the cashier and out of the store,” Chomley said.

High Noon at the Checkout Counter

Mashgin was born out of frustration over lunch breaks spent waiting in lines rather than chatting with friends. It’s installed its automated checkout system, also called Mashgin, in several Silicon Valley company cafeterias, including NVIDIA’s. Using GPU deep learning and computer vision, it recognizes your soup, salad or soda faster than you can gulp.

The elegant Mashgin self-checkout station features a very simple user interface. Customers simply place their lunch on the device, where five 3D cameras examine it from different angles to identify and price each item. To pay, customers swipe a credit card.

This animation depicts a future version of the Mashgin AI cafeteria checkout. Currently the device detects packaged goods, soups, salads and takeout containers, but is still being trained to identify foods on a plate. Animation courtesy of Mashgin.

The startup trained its system on a dataset of common items found in cafeterias, using the CUDA parallel computing platform, NVIDIA GeForce GTX 1080 GPUs and cuDNN with the Caffe deep learning framework. Mashgin customizes its system for each company’s cafeteria, and its deep learning algorithm learns new items as more people use it.

“It’s a huge market and there’s this big problem,” said Abhinai Srivastava, who founded the company with Mukul Dhankhar. “Everyone wants to eat at 12 o’clock.”

Catching Rays, Not Delays

IMAGR’s Chomley created SmartCart because he wasn’t getting enough sunshine. Stuck behind his computer screen at an investment fund most days, he yearned to spend a few minutes soaking up rays during lunch. Instead, the line for food at a small grocery near his office ate up his entire break.

Chomley quit his job and began work on what is now SmartCart. After several false starts — at one point, he had to take a job moving furniture to keep the company afloat — he and the IMAGR team set their course on deep learning and computer vision to enable SmartCart.

Using our TITAN X GPU and the TensorFlow deep learning framework, IMAGR initially trained its algorithms on images of grocery store products. Next, it used the SmartCart video camera to learn to recognize products put into or removed from the cart — say you reconsidered that half-gallon of chocolate chocolate-chip ice cream over a second bunch of kale. Finally, the team trained the algorithm on barcodes to learn prices.
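IMAGR hasn't published its model, but the kind of product-image training described here can be sketched in a few lines of modern TensorFlow/Keras. The directory layout, image size and number of product classes below are assumptions for illustration only, not IMAGR's actual pipeline.

    import tensorflow as tf

    NUM_CLASSES = 100           # hypothetical number of grocery products
    IMAGE_SIZE = (224, 224)

    # Assumes product photos arranged as one sub-folder per product
    # under product_images/ (a hypothetical directory).
    train_ds = tf.keras.preprocessing.image_dataset_from_directory(
        "product_images/", image_size=IMAGE_SIZE, batch_size=32)
    train_ds = train_ds.map(lambda x, y: (x / 255.0, y))   # scale pixels to [0, 1]

    # A small convolutional classifier; trains on the GPU if one is available.
    model = tf.keras.Sequential([
        tf.keras.layers.Conv2D(32, 3, activation="relu",
                               input_shape=IMAGE_SIZE + (3,)),
        tf.keras.layers.MaxPooling2D(),
        tf.keras.layers.Conv2D(64, 3, activation="relu"),
        tf.keras.layers.MaxPooling2D(),
        tf.keras.layers.GlobalAveragePooling2D(),
        tf.keras.layers.Dense(NUM_CLASSES, activation="softmax"),
    ])

    model.compile(optimizer="adam",
                  loss="sparse_categorical_crossentropy",
                  metrics=["accuracy"])
    model.fit(train_ds, epochs=5)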

IMAGR is planning a small SmartCart trial at a New Zealand grocery chain within the next couple of months. Chomley said several of the world’s largest supermarket chains have expressed interest in SmartCart.

“People just don’t want to be standing in huge lines,” he said. “They want to get in and get out.”

Source: NVIDIA Blog

You ask, AI delivers.

At least, that’s the concept that Kevin Peterson is trying to achieve with his robotics company, Marble. It recently made news for deploying food delivery robots onto the streets of San Francisco.

Peterson, Marble’s co-founder and software lead, joined this week’s AI Podcast to talk about their efforts to integrate AI into the delivery process.

Marble’s robots, all named “Happy,” look like white boxcars about the size of a mobility scooter, complete with a trunk where packages are stored. Users get a code with their delivery confirmation to access their packages.

“We want everyone’s first interaction with the robot to be delightful, actually,” explained Peterson in a conversation with Michael Copeland, the host of NVIDIA’s AI Podcast. “So we spend a lot of time designing that interaction and making sure the vehicle is operating in a way that looks good and is good.”

To provide efficient delivery, the Marble team uses a 3D map system to plan out the best routes for its delivery bots. According to Peterson, the robot runs a program that detects last-minute obstacles along the route and then requests a re-route.
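Marble's planning stack isn't described in detail, but the basic plan-then-re-plan idea can be sketched with a simple grid search in Python. The grid, start, goal and obstacle report below are invented purely for illustration.

    from heapq import heappush, heappop

    def plan(grid, start, goal):
        """A* shortest path on a 2D grid; cells equal to 1 are blocked."""
        rows, cols = len(grid), len(grid[0])
        frontier = [(0, start, [start])]
        visited = set()
        while frontier:
            _, (r, c), path = heappop(frontier)
            if (r, c) == goal:
                return path
            if (r, c) in visited:
                continue
            visited.add((r, c))
            for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                nr, nc = r + dr, c + dc
                if 0 <= nr < rows and 0 <= nc < cols and grid[nr][nc] == 0:
                    est = len(path) + abs(nr - goal[0]) + abs(nc - goal[1])
                    heappush(frontier, (est, (nr, nc), path + [(nr, nc)]))
        return None

    # Plan an initial route, then re-plan when perception reports an obstacle.
    grid = [[0] * 5 for _ in range(5)]
    route = plan(grid, (0, 0), (4, 4))
    print("initial route:", route)

    grid[2][2] = 1                       # a last-minute obstacle is detected
    route = plan(grid, (0, 0), (4, 4))   # request a re-route around it
    print("re-planned route:", route)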

For Peterson, automating delivery systems is only the beginning.

“There’s a huge amount of impact in the world that comes from having these kinds of autonomous vehicles out there,” he says.

Source: NVIDIA Blog

Today, the demand for on-demand image processing is increasing. Every program can benefit from better image processing, from Microsoft Office, which may run slowly on Windows 10, to advanced imaging software such as AutoCAD and other Autodesk applications used by companies in the construction and design industries.

An NVIDIA GRID license allows the GPU resources of a single GPU server to be shared among multiple users. Combined with Live Migration, this enables superior image and video processing, and you can even use NVIDIA GRID to run complex AI deep learning workloads while still maintaining smooth operation. Learn more on YouTube.


What Is A GPU?


GPU stands for Graphics Processing Unit. GPUs have many more cores than a comparable CPU: the most powerful graphics cards have more than 5,000 cores, while most CPUs have at most around 10 cores. This makes GPUs faster and more efficient than CPUs for highly parallel, repetitive computations such as those used in AI development. Learn more on YouTube.
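One simple way to see the effect of all those cores is to run the same matrix multiplication on the CPU with NumPy and on the GPU with CuPy, as in the sketch below. It assumes a CUDA-capable GPU and the cupy package are available, and the matrix size is arbitrary.

    import time
    import numpy as np
    import cupy as cp

    size = 4000
    a = np.random.rand(size, size).astype(np.float32)

    # CPU: matrix multiplication with NumPy.
    start = time.time()
    np.matmul(a, a)
    print("CPU time:", time.time() - start)

    # GPU: the same operation with CuPy, spread across thousands of cores.
    # (The first GPU call also includes one-time setup, so run it twice
    # for a fairer comparison.)
    b = cp.asarray(a)
    start = time.time()
    cp.matmul(b, b)
    cp.cuda.Stream.null.synchronize()   # wait for the GPU kernel to finish
    print("GPU time:", time.time() - start)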


Is NVIDIA’s DGX Station Better Than DIY Machines for AI/Deep Learning?

If you’re interested in AI and deep learning, you can either buy a supercomputer that can handle AI model training or build your own DIY machine. But is building your own a good idea? We don’t think so.

Save Time And IT Resources

Building your own AI computer is usually cheaper than an off-the-shelf unit such as the NVIDIA DGX Station, but only if you ignore the time and IT resources required to run the unit, download and update software, and maintain the computer. When you factor in these costs, the NVIDIA DGX Station is a much better option.

The NVIDIA DGX Station – Ready To Go For AI Projects

The NVIDIA DGX Station is ideal for AI scientists. This compact supercomputer is small enough to sit on your desk and has low power consumption.

It also offers enhanced security, as your sensitive data does not have to be on the cloud. And because it comes with pre-installed AI software, you can start AI model training in just 2-4 hours. There’s no need to spend extra time and money on additional IT equipment or support services.

Learn more about renting the NVIDIA DGX Station and about the NVIDIA Deep Learning Institute.