
Meet the Research Scientist: Shirley Ho

NYU Center for Data Science

What sets Dr. Ho apart is her pioneering work in applying deep learning techniques to astrophysics. “I’m excited to be part of CDS because it provides a unique environment where cutting-edge data science methods can be developed and applied to push the boundaries of science,” said Ho.


Amazon EC2 P5e instances are generally available

AWS Machine Learning Blog

To address customer needs for high performance and scalability in deep learning, generative AI, and HPC workloads, we are happy to announce the general availability of Amazon Elastic Compute Cloud (Amazon EC2) P5e instances, powered by NVIDIA H200 Tensor Core GPUs.


What Is a Transformer Model?

Hacker News

“Transformers made self-supervised learning possible, and AI jumped to warp speed,” said NVIDIA founder and CEO Jensen Huang in his keynote address this week at GTC. Transformers are in many cases replacing convolutional and recurrent neural networks (CNNs and RNNs), the most popular types of deep learning models just five years ago.
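The excerpt describes transformers at a high level; the mechanism underneath is attention. A minimal NumPy sketch of scaled dot-product attention, with illustrative shapes and values (not code from the article):

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Each query attends to all keys and returns a weighted sum of values."""
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)                    # query/key similarity
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)     # softmax over keys
    return weights @ V

# Three tokens with four-dimensional embeddings (illustrative values)
rng = np.random.default_rng(0)
Q = rng.normal(size=(3, 4))
K = rng.normal(size=(3, 4))
V = rng.normal(size=(3, 4))
out = scaled_dot_product_attention(Q, K, V)
print(out.shape)  # (3, 4): one output vector per token
```

Unlike a recurrent network, every token attends to every other token in a single step, which is one reason transformers parallelize so well.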


Announcing new Jupyter contributions by AWS to democratize generative AI and scale ML workloads

AWS Machine Learning Blog

Project Jupyter is a multi-stakeholder, open-source project that builds applications, open standards, and tools for data science, machine learning (ML), and computational science. The distribution is versioned using SemVer and will be released on a regular basis moving forward.


What Is ChatGPT Doing … and Why Does It Work?

Hacker News

And in fact the big breakthrough in “deep learning” that occurred around 2011 was associated with the discovery that in some sense it can be easier to do (at least approximate) minimization when there are lots of weights involved than when there are fairly few.
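The claim can be illustrated with a toy experiment (not code from the article): even with many weights, plain gradient descent drives the loss toward zero on a simple least-squares problem.

```python
import numpy as np

# Toy illustration: minimize a mean-squared-error loss over 50 weights.
rng = np.random.default_rng(42)
X = rng.normal(size=(100, 50))          # 100 samples, 50 features
true_w = rng.normal(size=50)
y = X @ true_w

w = np.zeros(50)                        # 50 weights to optimize
lr = 0.1
for _ in range(2000):
    grad = X.T @ (X @ w - y) / len(y)   # gradient of mean squared error
    w -= lr * grad

mse = np.mean((X @ w - y) ** 2)
print(mse)                              # approaches zero despite many weights
```

The high-dimensional loss surface here is easy for gradient descent to descend; the article's point is that, counterintuitively, this often holds in deep learning too.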