


Supplementary Material

Neural Information Processing Systems

Fair machine learning. Generally, fair machine learning methods fall into three categories: pre-processing, in-processing, and post-processing [44, 7]. In this paper, we focus on in-processing methods that modify learning algorithms to remove discrimination during the training process. All of these works address in-distribution fairness, whereas we investigate out-of-distribution fairness in this paper. We use LAFTR [42], an adversarial learning method with strong reported fairness performance [47], to learn a fair model in the source domain and adapt it to the target domain. We also test CFair [72] in our experiments.
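The idea behind in-processing methods is that the training objective itself is modified to discourage discrimination. As a minimal illustrative sketch (this is a toy demographic-parity regularizer, not LAFTR's adversarial objective; all variable names and data are hypothetical), a logistic regression loss can be augmented with a penalty on the gap in mean predicted score between two protected groups:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def train_fair_logreg(X, y, a, lam=0.0, lr=0.1, steps=500):
    """In-processing fairness sketch: logistic regression whose loss adds
    lam * (demographic-parity gap)^2, i.e. the squared difference in mean
    predicted score between group a==1 and group a==0."""
    w = np.zeros(X.shape[1])
    for _ in range(steps):
        p = sigmoid(X @ w)
        grad_ce = X.T @ (p - y) / len(y)          # cross-entropy gradient
        gap = p[a == 1].mean() - p[a == 0].mean()  # group score gap
        s = p * (1 - p)                            # sigmoid derivative
        # gradient of the gap w.r.t. w (difference of group-mean s_i * x_i)
        d_gap = (X[a == 1] * s[a == 1][:, None]).mean(0) \
              - (X[a == 0] * s[a == 0][:, None]).mean(0)
        w -= lr * (grad_ce + lam * 2.0 * gap * d_gap)
    return w

# Synthetic data: feature 0 is strongly correlated with group membership a.
rng = np.random.default_rng(0)
n = 400
a = rng.integers(0, 2, n)
X = np.column_stack([a + 0.1 * rng.normal(size=n), rng.normal(size=n)])
y = (rng.random(n) < 0.2 + 0.6 * a).astype(float)

def group_gap(w):
    p = sigmoid(X @ w)
    return abs(p[a == 1].mean() - p[a == 0].mean())

w_plain = train_fair_logreg(X, y, a, lam=0.0)
w_fair = train_fair_logreg(X, y, a, lam=5.0)
print(group_gap(w_plain), group_gap(w_fair))  # penalized model has a smaller gap
```

The same in-processing pattern underlies adversarial approaches such as LAFTR, which replace the explicit gap penalty with an adversary trained to predict the protected attribute from the learned representation.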



GTC: GNN-Transformer Co-contrastive Learning for Self-supervised Heterogeneous Graph Representation

Sun, Yundong, Zhu, Dongjie, Wang, Yansong, Tian, Zhaoshuo

arXiv.org Artificial Intelligence

Graph Neural Networks (GNNs) have emerged as a powerful tool for various graph tasks due to the message-passing mechanism's great local information aggregation ability. However, over-smoothing has always hindered GNNs from going deeper and capturing multi-hop neighbors. Unlike GNNs, Transformers can model global information and multi-hop interactions via multi-head self-attention, and a proper Transformer structure can show more immunity to the over-smoothing problem. So, can we propose a novel framework to combine GNN and Transformer, integrating both GNN's local information aggregation and Transformer's global information modeling ability to eliminate the over-smoothing problem? To realize this, this paper proposes a collaborative learning scheme for GNN-Transformer and constructs the GTC architecture. GTC leverages the GNN and Transformer branches to encode node information from different views respectively, and establishes contrastive learning tasks based on the encoded cross-view information to realize self-supervised heterogeneous graph representation. For the Transformer branch, we propose Metapath-aware Hop2Token and CG-Hetphormer, which can cooperate with the GNN to attentively encode neighborhood information from different levels. As far as we know, this is the first attempt in the field of graph representation learning to utilize both GNN and Transformer to collaboratively capture different view information and conduct cross-view contrastive learning. Experiments on real-world datasets show that GTC exhibits superior performance compared with state-of-the-art methods. Code is available at https://github.com/PHD-lanyu/GTC.
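The cross-view contrastive idea in the abstract can be illustrated with a minimal InfoNCE-style sketch (an assumed general form of such objectives; GTC's actual encoders and loss are defined in the paper): the GNN-branch and Transformer-branch embeddings of the same node form a positive pair, while all other nodes act as negatives.

```python
import numpy as np

def info_nce(z_a, z_b, tau=0.5):
    """Cross-view contrastive loss: row i of z_a (view A) should match
    row i of z_b (view B) against all other rows (negatives)."""
    z_a = z_a / np.linalg.norm(z_a, axis=1, keepdims=True)  # L2-normalize
    z_b = z_b / np.linalg.norm(z_b, axis=1, keepdims=True)
    sim = (z_a @ z_b.T) / tau                     # pairwise similarity logits
    sim = sim - sim.max(axis=1, keepdims=True)    # numerical stability
    log_prob = sim - np.log(np.exp(sim).sum(axis=1, keepdims=True))
    return -np.mean(np.diag(log_prob))            # positives on the diagonal

# Stand-in embeddings: a correlated pair of views vs. an unrelated view.
rng = np.random.default_rng(0)
gnn_view = rng.normal(size=(8, 16))                        # GNN-branch output
trans_view = gnn_view + 0.1 * rng.normal(size=(8, 16))     # Transformer-branch output
loss_aligned = info_nce(gnn_view, trans_view)
loss_random = info_nce(gnn_view, rng.normal(size=(8, 16)))
print(loss_aligned < loss_random)  # aligned cross-view pairs give a lower loss
```

Minimizing such a loss pulls the two branches' embeddings of each node together, which is what lets the self-supervised signal transfer local (GNN) and global (Transformer) structure between views.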


NVIDIA Celebrates 1 Million Jetson Developers Worldwide at GTC

#artificialintelligence

A million developers across the globe are now using the NVIDIA Jetson platform for edge AI and robotics to build innovative technologies. Plus, more than 6,000 companies -- a third of which are startups -- have integrated the platform with their products. These milestones and more will be celebrated during the NVIDIA Jetson Edge AI Developer Days at GTC, a global conference for the era of AI and the metaverse, taking place online March 20-23. Register free to learn more about the Jetson platform and begin developing the next generation of edge AI and robotics. Atlanta-based Kris Kersey, the mind behind the popular YouTube channel Kersey Fabrications, is one developer using the NVIDIA Jetson platform for his one-in-a-million technological innovations.


Your NVIDIA Systems Just Got Faster - KDnuggets

#artificialintelligence

NVIDIA AI is not something you can download. It is not one package. Yet almost everything our company builds touches and contributes to AI. As NVIDIA founder and CEO Jensen Huang said during his opening GTC keynote, developers use AI to achieve groundbreaking science, solve the world's most complex problems, and revolutionize industries. That work is packaged into software development kits, libraries, frameworks and tools.


World-leading AI research and inclusion at the forefront of this year's NVIDIA GTC

#artificialintelligence

This article is part of the VB Lab / NVIDIA GTC insight series. "The story of GTC is in many ways the story of NVIDIA, and it's also the story of what's happening in technology," says Greg Estes, VP of corporate marketing and developer programs at NVIDIA. Twelve years ago, GTC began as a conference focused squarely on GPUs, and at that time, that meant primarily graphics and gaming. "But then people figured out that GPUs are the perfect architecture for AI," says Estes. GTC is now billed as the conference for AI innovators, developers, technologists, startups and creatives, and this year it will offer over 1,500 sessions covering breakthroughs in AI, data center, accelerated computing, autonomous vehicles, health care, intelligent networking, game development, and more.


Say What? Conversational AI Takes the Mic at GTC

#artificialintelligence

Conversational AI is revolutionizing how businesses operate in every industry with applications like virtual agents, chatbots and assistants. Creating an intelligent and intuitive app involves quickly adapting new state-of-the-art research and deploying it in production. You can learn about the latest advancements in this area at the GPU Technology Conference, taking place October 5-9. At GTC, researchers and developers from leading institutions across the globe will share new techniques and innovations in speech recognition, natural language processing and text-to-speech technologies. At GTC, the NVIDIA Deep Learning Institute is offering instructor-led, hands-on training on how to use Transformer-based natural language processing models for text classification tasks, such as categorizing documents.


Field Report: GPU Technology Conference 2019 #GTC19 - insideBIGDATA

#artificialintelligence

I eagerly attended my 3rd GPU Technology Conference (GTC), the "Deep Learning & AI Conference," in Silicon Valley, March 23-26, as a guest of host NVIDIA. GTC has become my favorite tech event of the year due to its highly focused topic areas that align well with my own: data science, machine learning, AI, and deep learning; plus the show has an academic feel that I appreciate. NVIDIA CEO Jensen Huang delivered another of his patented marathon keynote addresses, unveiling the company's vision for the upcoming year. The company had to move the keynote's location from the San Jose Convention Center to a very large hall at San Jose State University (complete with pedicabs provided by Kinetica). At around 2 hours and 40 minutes, Huang's seamless and riveting keynote was masterful, with no notes or teleprompter used.


This Is The Only AI Conference You Need To Attend This Year

#artificialintelligence

It's 2019 and AI is well past the hype phase. The technology has advanced, with faster computing chips, smaller form factors, and improved power efficiency. As a result, there has been explosive growth in online AI training courses to get developers and data scientists started and advance their skills. Inboxes now flood with invitations to the newest AI conferences you "must attend." They offer a chaos of information on the newest advancements and resources to stay up to date.