Customizing coding companions for organizations

AWS Machine Learning Blog

His research interests lie in Natural Language Processing, AI4Code, and generative AI. Xiaofei has been serving as the science manager for several services, including Kendra, Contact Lens, and most recently CodeWhisperer and CodeGuru Security. He received his PhD in Computer Science from Purdue University in 2008.

Federated Learning on AWS with FedML: Health analytics without sharing sensitive data – Part 1

AWS Machine Learning Blog

At the application level (for example, computer vision, natural language processing, and data mining), data scientists and engineers only need to write the model, data, and trainer in the same way as in a standalone program and then pass them to the FedMLRunner object to complete the entire workflow, as shown in the following code.
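The code the snippet refers to is not included in this excerpt. A minimal sketch of the pattern it describes, based on FedML's published quickstart (exact function signatures may differ across FedML versions, and actually running it requires a FedML YAML configuration and a federated runtime):

```python
import fedml
from fedml import FedMLRunner

if __name__ == "__main__":
    # Initialize FedML from command-line arguments / YAML config
    args = fedml.init()

    # Device, dataset, and model are written just as in a standalone program
    device = fedml.device.get_device(args)
    dataset, output_dim = fedml.data.load(args)
    model = fedml.model.create(args, output_dim)

    # FedMLRunner takes over the federated training workflow
    FedMLRunner(args, device, dataset, model).run()
```

The point of the pattern is that the federated orchestration (client/server communication, aggregation) is hidden behind the runner, so the model and data code stay unchanged from a local training script.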

Joshua Walker: Using Data to Improve the Legal System

DataRobot

Natural language processing used to be a dirty word because it didn’t really work. That alone makes him one of the main drivers of computer science.” That is what led Joshua to found Lex Machina in 2008. That’s something we can grapple with, and that doesn’t terrify people.

Financial text generation using a domain-adapted fine-tuned large language model in Amazon SageMaker JumpStart

AWS Machine Learning Blog

Large language models (LLMs) with billions of parameters are currently at the forefront of natural language processing (NLP). These models are shaking up the field with their incredible abilities to generate text, analyze sentiment, translate languages, and much more.

Identifying defense coverage schemes in NFL’s Next Gen Stats

AWS Machine Learning Blog

Advances in Neural Information Processing Systems 32 (2019). He is broadly interested in Deep Learning and Natural Language Processing. He has a degree in Mathematics and Computer Science from the University of Illinois at Urbana-Champaign. Van der Maaten, Laurens, and Geoffrey Hinton.

Domain-adaptation Fine-tuning of Foundation Models in Amazon SageMaker JumpStart on Financial data

AWS Machine Learning Blog

Large language models (LLMs) with billions of parameters are currently at the forefront of natural language processing (NLP). These models are shaking up the field with their incredible abilities to generate text, analyze sentiment, translate languages, and much more.

A review of purpose-built accelerators for financial services

AWS Machine Learning Blog

In computer science, a number can be represented with different levels of precision, such as double precision (FP64), single precision (FP32), and half precision (FP16). The benchmark used is RoBERTa-Base, a popular transformer-based model used in natural language processing (NLP) applications.
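To make the precision levels concrete, here is a small illustrative example (using NumPy, which the article does not mention) showing how many significant digits of the same value survive at each precision:

```python
import numpy as np

# The same value stored at three precisions; lower precision keeps fewer digits.
x = 1.0 / 3.0
fp64 = np.float64(x)   # double precision: ~15-16 significant decimal digits
fp32 = np.float32(x)   # single precision: ~7 significant decimal digits
fp16 = np.float16(x)   # half precision:   ~3 significant decimal digits

print(f"{fp64:.10f}")  # 0.3333333333
print(f"{fp32:.10f}")  # 0.3333333433
print(f"{fp16:.10f}")  # 0.3332519531
```

Accelerators trade this precision for throughput: FP16 values are half the size of FP32, so more of them fit in memory and move through the compute units per cycle, which is why reduced-precision benchmarks like the RoBERTa-Base one are common.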
