cogent
DAN GAINOR: Leftist MSNBC changes its name, but it's still the same embarrassment
MSNBC's "Morning Joe" reacted to the networks upcoming name change, "My Source News Opinion World," or MS NOW, on Monday. But don't shed a tear (not that you would, anyway), it's turning into MS NOW. Or, as the New York Times put it, "Goodbye, MSNBC. The far-left network lost its tie to the newsy term "NBC" and looks more like some feminist retread site. Or, as MSNBC President Rebecca Kutler put it, "While our name will be changing, who we are and what we do will not." So, maybe my viewership assessment is correct. Sure, the ship might have made a career of hitting icebergs, but it's got a new name. The fallout from the change was swift. The Times even took a swipe with the follow-up headline: "MSNBC's Rebrand Invites Bemusement and Ridicule." The name switch reflects marketing nonsense as part of the corporate split. It also eliminates the long-standing comparison to MSDNC. The rationalization for the new name is: "My Source for News, Opinion, and the World." CNBC is going to keep its name, according to the Wall Street Journal, but the initials mean something else – "Consumer News and Business Channel," another marketing nuance. The new company will include, "NBCUniversal's cable television networks, including USA Network, CNBC, MSNBC, Oxygen, E!, SYFY and Golf Channel" along with a few other properties, including the formerly useful Rotten Tomatoes movie site. Nobody sane wants MSNBC/MS NOW connected in any way to NBC. It's been a corporate embarrassment for years. They're OK with it looking like the rational folks at CNBC are still connected, but the lunacy of MSNBC gets rebranded. It removes the stain for NBC. The more things change, the more they remain the same. This is the same network where they repeatedly compare President Donald Trump to monsters like Hitler and Stalin. Hosts regularly throw around charges of dictatorship like we are living in 1930s Germany – although somehow they are allowed to say it. Host Tiffany Cross recently claimed the government was grabbing people and "transporting them to concentration camps." And the face of the franchise, MSNBC host Rachel Maddow, told viewers, "We have a consolidating dictatorship in our country." Remember, "Morning Joe" host Joe Scarborough made the most-embarrassing quote of the entire failed Joe Biden presidency: "I've said it for years now, he's cogent.
- North America > United States (1.00)
- Europe > Germany (0.25)
- Europe > Ukraine (0.05)
- Media > Television (1.00)
- Leisure & Entertainment (1.00)
- Government > Regional Government > North America Government > United States Government (0.72)
A Unified Contrastive-Generative Framework for Time Series Classification
Liu, Ziyu, Alavi, Azadeh, Li, Minyi, Zhang, Xiang
Self-supervised learning (SSL) for multivariate time series mainly includes two paradigms: contrastive methods that excel at instance discrimination and generative approaches that model data distributions. While effective individually, their complementary potential remains unexplored. We propose a Contrastive Generative Time series framework (CoGenT), the first framework to unify these paradigms through joint contrastive-generative optimization. CoGenT addresses fundamental limitations of both approaches: it overcomes contrastive learning's sensitivity to high intra-class similarity in temporal data while reducing generative methods' dependence on large datasets. We evaluate CoGenT on six diverse time series datasets. The results show consistent improvements, with up to 59.2% and 14.27% F1 gains over standalone SimCLR and MAE, respectively. Our analysis reveals that the hybrid objective preserves discriminative power while acquiring generative robustness. These findings establish a foundation for hybrid SSL in temporal domains. We will release the code shortly.
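The abstract names SimCLR and MAE as the standalone baselines, which suggests the hybrid objective pairs a contrastive instance-discrimination term with a masked-reconstruction term. A minimal PyTorch sketch of what such a joint loss could look like follows; the function names, the NT-Xent/MSE choices, and the weighting `lam` are illustrative assumptions, not the paper's actual implementation.

```python
# Sketch of a joint contrastive-generative objective for time series,
# assuming a SimCLR-style NT-Xent term plus an MAE-style masked MSE term.
# All names and choices here are illustrative, not taken from the paper.
import torch
import torch.nn.functional as F

def nt_xent(z1, z2, temperature=0.5):
    """Contrastive NT-Xent loss over two augmented views, each (batch, dim)."""
    z = F.normalize(torch.cat([z1, z2], dim=0), dim=1)       # (2B, d)
    sim = z @ z.t() / temperature                            # similarities
    n = z1.size(0)
    self_mask = torch.eye(2 * n, dtype=torch.bool, device=z.device)
    sim.masked_fill_(self_mask, float("-inf"))               # drop self-pairs
    # View i's positive is view i+n, and vice versa.
    targets = torch.cat([torch.arange(n, 2 * n), torch.arange(0, n)]).to(z.device)
    return F.cross_entropy(sim, targets)

def masked_reconstruction(decoded, x, mask):
    """Generative MSE computed only on masked timesteps (mask == 1)."""
    return ((decoded - x) ** 2 * mask).sum() / mask.sum().clamp(min=1)

def joint_loss(z1, z2, decoded, x, mask, lam=1.0):
    # Hybrid objective: instance discrimination + distribution modeling.
    return nt_xent(z1, z2) + lam * masked_reconstruction(decoded, x, mask)
```

In this kind of setup, the contrastive term keeps embeddings discriminative while the reconstruction term regularizes the encoder toward modeling the data distribution, which matches the complementarity the abstract claims.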
COGENT: A Curriculum-oriented Framework for Generating Grade-appropriate Educational Content
Liu, Zhengyuan, Yin, Stella Xin, Goh, Dion Hoe-Lian, Chen, Nancy F.
While Generative AI has demonstrated strong potential and versatility in content generation, its application to educational contexts presents several challenges. Models often fail to align with curriculum standards and maintain grade-appropriate reading levels consistently. Furthermore, STEM education poses additional challenges in balancing scientific explanations with everyday language when introducing complex and abstract ideas and phenomena to younger students. In this work, we propose COGENT, a curriculum-oriented framework for generating grade-appropriate educational content. We incorporate three curriculum components (science concepts, core ideas, and learning objectives), control readability through length, vocabulary, and sentence complexity, and adopt a "wonder-based" approach to increase student engagement and interest. We conduct a multi-dimensional evaluation via both LLM-as-a-judge and human expert analysis. Experimental results show that COGENT consistently produces grade-appropriate passages that are comparable or superior to human references. Our work establishes a viable approach for scaling adaptive and high-quality learning resources.
- Asia > Middle East > UAE > Abu Dhabi Emirate > Abu Dhabi (0.14)
- Asia > Singapore (0.05)
- Europe > United Kingdom > England (0.04)
- (5 more...)
- Education > Educational Setting > K-12 Education (1.00)
- Education > Curriculum > Subject-Specific Education (0.90)
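The COGENT abstract above describes three curriculum components plus readability controls, but not the concrete prompting pipeline. Purely as an illustration of the curriculum-oriented idea, here is a hypothetical prompt builder; every field name, constraint, and wording below is invented for illustration, not taken from the paper.

```python
# Hypothetical prompt assembly in the spirit of the COGENT abstract above:
# inject curriculum components, constrain readability, open with "wonder".
from dataclasses import dataclass

@dataclass
class CurriculumSpec:
    science_concept: str     # e.g. "states of matter"
    core_idea: str           # e.g. "matter changes state when heated or cooled"
    learning_objective: str  # e.g. "explain melting and freezing with examples"
    grade: int               # target grade level

def build_prompt(spec: CurriculumSpec, max_words=250, max_sentence_words=15):
    return (
        f"Write a science passage for grade {spec.grade} students.\n"
        f"Science concept: {spec.science_concept}\n"
        f"Core idea: {spec.core_idea}\n"
        f"Learning objective: {spec.learning_objective}\n"
        f"Constraints: at most {max_words} words; keep sentences under "
        f"{max_sentence_words} words; prefer everyday vocabulary.\n"
        "Open with a 'wonder' question that sparks curiosity, then explain "
        "the idea using a familiar everyday example."
    )

print(build_prompt(CurriculumSpec(
    science_concept="states of matter",
    core_idea="matter changes state when heated or cooled",
    learning_objective="explain melting and freezing with examples",
    grade=3,
)))
```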
Quick thoughts on GPT3
OpenAI, an AI research lab started by Elon Musk, Sam Altman, Greg Brockman, and a few other leaders in ML, recently released an API and website that allow people to access a new language model called GPT-3. I've had the chance to play with it over the past few days and have been truly amazed by its capabilities. I'd like to start off by stating that, especially among my extremely intelligent ML friends, I am quite the layman, so this post is aimed at a nontechnical audience; I apologize for any technical errors. GPT-3 is essentially a context-based generative AI: when it is given some sort of context, it tries to fill in the rest.
- Information Technology > Artificial Intelligence > Natural Language > Large Language Model (1.00)
- Information Technology > Artificial Intelligence > Natural Language > Chatbot (1.00)
- Information Technology > Artificial Intelligence > Machine Learning > Neural Networks > Deep Learning > Generative AI (0.57)
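To make the post's "given some context, fill in the rest" concrete: with the 2020-era openai Python client, you sent a prompt (the context) to a completion endpoint and the model returned a continuation. The engine name, sampling parameters, and prompt below are illustrative, not from the post.

```python
# Minimal GPT-3 completion call, assuming the 2020-era `openai` client and an
# API key in the OPENAI_API_KEY environment variable. Values are illustrative.
import os
import openai

openai.api_key = os.environ["OPENAI_API_KEY"]

# The prompt is the "context"; the model tries to fill in what comes next.
response = openai.Completion.create(
    engine="davinci",           # the original GPT-3 base model
    prompt="Q: Why is the sky blue?\nA:",
    max_tokens=64,
    temperature=0.7,
    stop=["\nQ:"],              # stop before the model invents a new question
)
print(response.choices[0].text.strip())
```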
COGENT: Cognitive Agent for Cogent Analysis
Tecuci, Gheorghe (George Mason University) | Marcu, Dorin (George Mason University) | Boicu, Mihai (George Mason University) | Schum, David (George Mason University)
Timely, relevant, and accurate intelligence analysis is critical to national security, but it is astonishingly complex. This paper provides an intuitive overview of Cogent, a cognitive assistant that facilitates a synergistic integration of the analyst's imaginative reasoning with the agent's critical reasoning to draw defensible and persuasive conclusions from masses of evidence, in a world that is changing all the time. It presents Cogent's design goals, which characterize a new generation of structured analytical tools; introduces the evidence-based analysis concepts on which it is grounded; illustrates a sample session with its current version; and summarizes the cognitive assistance provided to its user.
- North America > United States > Virginia > Fairfax County > Fairfax (0.05)
- North America > United States > District of Columbia > Washington (0.04)
- North America > United States > Virginia > Fairfax County > McLean (0.04)
- Europe > United Kingdom > England > Cambridgeshire > Cambridge (0.04)
Forward - Backward Greedy Algorithms for Atomic Norm Regularization
Rao, Nikhil, Shah, Parikshit, Wright, Stephen
In many signal processing applications, the aim is to reconstruct a signal that has a simple representation with respect to a certain basis or frame. Fundamental elements of the basis known as "atoms" allow us to define "atomic norms" that can be used to formulate convex regularizations for the reconstruction problem. Efficient algorithms are available to solve these formulations in certain special cases, but an approach that works well for general atomic norms, both in terms of speed and reconstruction accuracy, remains to be found. This paper describes an optimization algorithm called CoGEnT that produces solutions with succinct atomic representations for reconstruction problems, generally formulated with atomic-norm constraints. CoGEnT combines a greedy selection scheme based on the conditional gradient approach with a backward (or "truncation") step that exploits the quadratic nature of the objective to reduce the basis size. We establish convergence properties and validate the algorithm via extensive numerical experiments on a suite of signal processing applications. Our algorithm and analysis also allow for inexact forward steps and for occasional enhancements of the current representation to be performed. CoGEnT can outperform the basic conditional gradient method, and indeed many methods that are tailored to specific applications, when the enhancement and truncation steps are defined appropriately. We also introduce several novel applications that are enabled by the atomic-norm framework, including tensor completion, moment problems in signal processing, and graph deconvolution.
- North America > United States > Wisconsin > Dane County > Madison (0.04)
- Europe > Portugal > Lisbon > Lisbon (0.04)
- Europe > Netherlands > North Holland > Amsterdam (0.04)
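The CoGEnT abstract above spells out the algorithm's two moves: a conditional-gradient forward step that greedily adds an atom, and a backward truncation step that exploits the quadratic objective to prune the representation. Below is a toy sketch for the simplest atomic set (signed coordinates, so the atomic norm is the l1 norm); it is a schematic of the forward-backward pattern under that assumption, not the paper's implementation, which handles general atomic sets, inexact steps, and enhancements.

```python
# Toy forward-backward greedy sketch in the spirit of CoGEnT, specialized to
# the l1 ball: minimize 0.5*||Ax - y||^2 subject to ||x||_1 <= tau.
# Schematic only; the paper treats general atomic norms.
import numpy as np

def forward_backward_l1(A, y, tau, iters=50, backward_tol=1e-4):
    m, n = A.shape
    x = np.zeros(n)
    for _ in range(iters):
        r = A @ x - y                       # residual
        g = A.T @ r                         # gradient of the quadratic
        # Forward (conditional gradient): best atom of the scaled l1 ball,
        # i.e. the vertex tau * (+/- e_i) most aligned with -g.
        i = int(np.argmax(np.abs(g)))
        s = np.zeros(n)
        s[i] = -tau * np.sign(g[i])
        d = s - x
        Ad = A @ d
        denom = Ad @ Ad
        if denom <= 1e-12:
            break
        gamma = float(np.clip(-(r @ Ad) / denom, 0.0, 1.0))  # exact line search
        x = x + gamma * d
        # Backward (truncation): zero the support entry whose removal raises
        # the quadratic objective least, if the increase is below tolerance.
        support = np.flatnonzero(np.abs(x) > 1e-10)
        if len(support) > 1:
            r = A @ x - y
            best_j, best_inc = -1, np.inf
            for j in support:
                inc = 0.5 * np.sum((r - A[:, j] * x[j]) ** 2) - 0.5 * (r @ r)
                if inc < best_inc:
                    best_j, best_inc = j, inc
            if best_inc < backward_tol:
                x[best_j] = 0.0
    return x
```

The backward step is cheap here precisely because the objective is quadratic: removing coordinate j shifts the residual by A[:, j] * x[j], so the cost increase can be evaluated in closed form without re-solving.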