"Machine translation (MT) is the application of computers to the task of translating texts from one natural language to another. One of the very earliest pursuits in computer science, MT has proved to be an elusive goal, but today a number of systems are available which produce output which, if not perfect, is of sufficient quality to be useful in a number of specific domains."
– Definition from the European Association for Machine Translation (EAMT).
In a bid to make transformer models even better for real-world applications, researchers from Google, the University of Cambridge, DeepMind, and the Alan Turing Institute have proposed a new transformer architecture called "Performer" -- based on what they call fast attention via orthogonal random features (FAVOR). First proposed in 2017 and believed to be particularly well suited for language understanding tasks, the transformer is a neural network architecture based on a self-attention mechanism. To date, in addition to achieving SOTA performance in Natural Language Processing and Neural Machine Translation tasks, transformer models have also performed well across other machine learning (ML) tasks such as document generation/summarization, time series prediction, image generation, and analysis of biological sequences. Neural networks usually process language by generating fixed- or variable-length vector-space representations. A transformer, however, performs only a small, constant number of steps -- in each step, it applies a self-attention mechanism that can directly model relationships between all the words in a sentence, regardless of their positions.
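To make the mechanism concrete, here is a minimal NumPy sketch (not the official Performer implementation) of exact scaled dot-product self-attention, which costs O(n^2) in sequence length, alongside an illustrative FAVOR-style linear approximation that replaces the softmax with positive random features so attention factorizes and the n x n matrix is never formed. All function names, the feature count `m`, and the toy dimensions are assumptions for illustration only.

```python
import numpy as np

def softmax_attention(Q, K, V):
    """Exact scaled dot-product self-attention.

    Every query attends to every key, so the score matrix is
    n x n -- quadratic in sequence length n.
    """
    d = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d)                      # (n, n)
    A = np.exp(scores - scores.max(axis=-1, keepdims=True))
    A /= A.sum(axis=-1, keepdims=True)                 # row-wise softmax
    return A @ V

def favor_attention(Q, K, V, m=4096, seed=0):
    """Linear-time approximation in the spirit of FAVOR.

    Random feature maps phi(.) satisfy E[phi(q) . phi(k)] = exp(q . k),
    so attention factorizes as phi(Q) @ (phi(K).T @ V) and the n x n
    attention matrix is never materialized. Illustrative sketch only.
    """
    d = Q.shape[-1]
    rng = np.random.default_rng(seed)
    W = rng.standard_normal((m, d))                    # random projections

    def phi(X):
        # positive features: exp(w . x - ||x||^2 / 2) / sqrt(m)
        Xs = X / d ** 0.25                             # folds in 1/sqrt(d)
        return np.exp(Xs @ W.T - 0.5 * (Xs ** 2).sum(-1, keepdims=True)) / np.sqrt(m)

    Qp, Kp = phi(Q), phi(K)                            # (n, m) each
    num = Qp @ (Kp.T @ V)                              # (n, d), no (n, n) step
    den = Qp @ Kp.sum(axis=0)                          # per-query normalizer
    return num / den[:, None]

# Tiny demo: the approximation tracks exact attention on a toy sequence.
n, d = 8, 16
rng = np.random.default_rng(1)
Q, K, V = (rng.standard_normal((n, d)) * 0.1 for _ in range(3))
exact = softmax_attention(Q, K, V)
approx = favor_attention(Q, K, V)
print(np.abs(exact - approx).max())  # shrinks as m grows
```

The key design point is the order of multiplication: computing `Kp.T @ V` first keeps every intermediate at most n x m or m x d, which is why Performer-style attention scales linearly with sequence length.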
Google's fourth-generation tensor processing units (TPUs), whose existence wasn't publicly revealed until today, can complete AI and machine learning training workloads in close-to-record wall clock time. That's according to the latest set of metrics released by MLPerf, the consortium of over 70 companies and academic institutions behind the MLPerf suite for AI performance benchmarking. It shows clusters of fourth-gen TPUs surpassing the capabilities of third-generation TPUs -- and even those of Nvidia's recently released A100 -- on object detection, image classification, natural language processing, machine translation, and recommendation benchmarks. Google says its fourth-generation TPU offers more than double the matrix multiplication TFLOPS of a third-generation TPU, where one TFLOPS equals one trillion floating-point operations per second. It also offers a "significant" boost in memory bandwidth while benefiting from unspecified advances in interconnect technology.
If you had asked me a year or two ago when Artificial General Intelligence (AGI) would be invented, I'd have told you that we were a long way off. Most experts were saying that AGI was decades away, and some were saying it might not happen at all. The consensus is -- was? -- that all the recent progress in AI concerns so-called "narrow AI," meaning systems that can only perform one specific task. An AGI, or "strong AI," which could perform any task as well as a human being, is a much harder problem. It is so hard that there isn't a clear roadmap for achieving it, and few researchers are openly working on the topic. GPT-3 is the first model to seriously shake that status quo. GPT-3 is the latest language model from the OpenAI team.
Migrating a codebase from an archaic programming language such as COBOL to a modern alternative like Java or C is a difficult, resource-intensive task that requires expertise in both the source and target languages. COBOL, for example, is still widely used today in mainframe systems around the world, so companies, governments, and others often must choose whether to manually translate their code bases or commit to maintaining code written in a language that dates back to the 1950s. We've developed TransCoder, an entirely self-supervised neural transcompiler system that can make code migration far easier and more efficient. Our method is the first AI system able to translate code from one programming language to another without requiring parallel data for training. We've demonstrated that TransCoder can successfully translate functions between C, Java, and Python 3. TransCoder outperforms open source and commercial rule-based translation programs.
Natural Language Processing is among the hottest topics in the field of data science. Companies are pouring money into research in this field. Everyone is trying to understand Natural Language Processing and its applications in order to build a career around it. Every business out there wants to integrate it somehow. In just a few years, natural language processing has evolved into something more powerful and impactful than anyone could have imagined.
To better understand the landscape of available tools for machine learning production, I decided to look up every AI/ML tool I could find. After filtering out application companies (e.g., companies that use ML to provide business analytics), tools that aren't being actively developed, and tools that nobody uses, I got 202 tools. Please let me know if there are tools you think I should include but aren't on the list yet! I categorize the tools based on which step of the workflow each one supports. I don't include project setup, since it requires project management tools, not ML tools.
Natural language processing, or NLP, is a type of artificial intelligence (AI) that specializes in analyzing human language. Have you ever used Apple's Siri and wondered how it understands (most of) what you're saying? This is an example of NLP in practice. NLP is becoming an essential part of our lives, and together with machine learning and deep learning, produces results that are far superior to what could be achieved just a few years ago. In this article we'll take a closer look at NLP, see how it's applied and learn how it works.
Machine translation is an automated way to translate text or speech from one language to another. It can take volumes of data and provide translations into a large number of supported languages. Although not intended to fully replace human translators, it can provide value when immediate translations are needed for a wide variety of languages. If you're looking to translate content on the web, you have several options. Many popular browsers offer translation capabilities, which are either built in (e.g., in Google Chrome) or available through extensions.
The 58th Annual Meeting of the Association for Computational Linguistics (ACL 2020), the premier annual conference on AI and language, takes place July 5-10. As is the case with most events currently, ACL will be virtual this year due to COVID-19. At IBM Research AI, we’re excited to share with you — wherever you might be in the world — all the work we’ll have at ACL 2020 designed to advance AI for the enterprise. The ability of AI to master language has been one of IBM Research AI’s key areas of focus for years. The field of Natural Language Processing (NLP) is constantly evolving in an effort to better outfit AI with the ability to communicate the way we humans can. It’s an incredibly challenging area of research. An AI must identify, decipher, and navigate natural language barriers — slang, idioms, acronyms, different languages, and extracting meaning from multi-format documents, to name a few. To tackle these challenges, earlier this year IBM released a new, four-part mastering-language taxonomy…