
Collaborating Authors

 datasciencecentral


Future of Education: Application not Regurgitation of Knowledge – Part II - DataScienceCentral.com

#artificialintelligence

AI technologies like ChatGPT are necessitating a fundamental overhaul of our educational systems and institutions. Getting the right answers to predetermined tests is no longer sufficient in an age where AI can access, integrate, and recite knowledge billions, if not trillions, of times faster than the human mind. So, what are the skills, capabilities, and experiences that our students and citizens will need to prosper in an age where personal and professional success will be based on the application, not the memorization and regurgitation, of knowledge? Let's continue that conversation here in Part II to define the requirements for humans to excel in creating organizational and societal value in a world dominated by AI and Big Data. Many organizations engage in a "wear 'em down" decision-making process when dealing with wicked hard challenges with multiple opposing views.


Future of Education: Application not Regurgitation of Knowledge – Part I - DataScienceCentral.com

#artificialintelligence

When I was getting my MBA at the University of Iowa in 1981, my advisor Gary Fethke (who would later serve as University of Iowa interim president and Emeritus Professor in Business Analytics) convinced me to take a PhD class in econometrics. I think he was trying to punish me or something. I was totally overwhelmed in the class as student after student quickly answered questions about this economic theorem or that economic concept. Hands were popping up all over the room in a rush to answer these questions, while I sat in the back of the room madly trying to understand the applicability of these theorems and concepts. On the day of the final exam, I thought I was doomed.


Creating Healthy AI Utility Function: Importance of Diversity – Part I - DataScienceCentral.com

#artificialintelligence

Sometimes you start a blog with a hypothesis in mind, and then that intention changes as you research and realize that your original idea was wrong. Yep, this is one of those blogs. Learning can be fun if you let go of pre-existing dogma and learn along your life journey. I've always been curious (a good trait) about economics' role in developing an organization's business-driven AI and Data strategies. And that relationship comes to life when we compare economic value and the AI utility function.


FAIR Content: Better Chatbot Answers and Content Reusability at Scale - DataScienceCentral.com

#artificialintelligence

Back in 2018, I had the privilege of keynoting at one of Semantic Web Company's events in Vienna, as well as attending the full event. It was a great opportunity to immerse myself in the Central European perspective on the utility of Linked Open Data standards and how those standards were being applied. I got to meet innovators at enterprises making good use of Linked Open Vocabularies with the help of SWC's PoolParty semantics platform, Ontotext's GraphDB and Semiodesk's semantic graph development acceleration software, for example. There is so much that is impactful and powerful going on at these kinds of semantic technology events. So many people in the audience grasp the importance of a semantic layer to findable, accessible, interoperable, and reusable (FAIR) data, regardless of its origin and its original form, whether structured data or document and multimedia content.


Enterprise use cases for GPT-3: How to chat with your own data - DataScienceCentral.com

#artificialintelligence

It's easy to think of LLMs (large language models) as just 'hallucinating', or as mere generators of text; a glorified LSTM, so to speak. While LLMs have some limitations (and indeed they are evolving), a far more interesting question to explore is: how can LLMs be used in enterprise applications? In many ways, enterprise applications of LLMs can overcome some of these problems. One possible solution is a combination of Azure Cognitive Search and Azure OpenAI Service. Taking a B2B perspective, the solution involves "chatting with your own data".
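
The "chat with your own data" pattern described above is retrieval-augmented generation: a search index first retrieves relevant passages from enterprise documents, and the LLM is then prompted to answer from those passages only. Here is a minimal, provider-agnostic sketch; the toy `search` function and the prompt wording are illustrative stand-ins for Azure Cognitive Search and an Azure OpenAI chat-completion call, not their actual APIs:

```python
def search(index, query, top_k=2):
    """Toy keyword retriever standing in for Azure Cognitive Search."""
    terms = set(query.lower().split())
    scored = [(len(terms & set(doc.lower().split())), doc) for doc in index]
    scored.sort(key=lambda pair: pair[0], reverse=True)
    return [doc for score, doc in scored[:top_k] if score > 0]

def build_prompt(question, passages):
    """Ground the model: answer only from the retrieved enterprise passages."""
    context = "\n".join(f"- {p}" for p in passages)
    return (
        "Answer using ONLY the sources below. If they are insufficient, say so.\n"
        f"Sources:\n{context}\n\nQuestion: {question}\nAnswer:"
    )

# Hypothetical enterprise documents for illustration.
index = [
    "Invoices are processed within 30 days of receipt.",
    "Employees accrue 1.5 vacation days per month.",
    "The VPN requires multi-factor authentication.",
]
question = "How many vacation days do employees accrue?"
passages = search(index, question)
prompt = build_prompt(question, passages)
# `prompt` would then be sent to the LLM (e.g., an Azure OpenAI chat completion).
```

Because the answer is constrained to retrieved sources, hallucination risk drops and responses stay grounded in the organization's own data.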


Data Science and Machine Learning Mathematical and Statistical Methods - DataScienceCentral.com

#artificialintelligence

Next, we cover Unsupervised Learning techniques such as density estimation, clustering, and principal component analysis. Important tools in unsupervised learning include the cross-entropy training loss, mixture models, the Expectation–Maximization algorithm, and the Singular Value Decomposition. This is followed by Regression. The purpose of this chapter is to explain the mathematical ideas behind regression models and their practical aspects.
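
To make the clustering topic concrete, here is a minimal pure-Python k-means sketch, one standard clustering method; the data and the naive initialization are illustrative assumptions, not taken from the book:

```python
def kmeans(points, k, iters=100):
    """Minimal k-means on 2-D points: alternate assignment and update steps."""
    # Naive initialization: the first k points (real code would use k-means++).
    centers = list(points[:k])
    clusters = [[] for _ in range(k)]
    for _ in range(iters):
        # Assignment step: attach each point to its nearest center.
        clusters = [[] for _ in range(k)]
        for p in points:
            j = min(range(k),
                    key=lambda i: (p[0] - centers[i][0]) ** 2
                                  + (p[1] - centers[i][1]) ** 2)
            clusters[j].append(p)
        # Update step: move each center to the mean of its cluster
        # (empty clusters keep their previous center).
        new_centers = [
            (sum(p[0] for p in cl) / len(cl), sum(p[1] for p in cl) / len(cl))
            if cl else centers[j]
            for j, cl in enumerate(clusters)
        ]
        if new_centers == centers:
            break  # assignments stable: converged
        centers = new_centers
    return centers, clusters

# Two well-separated blobs: one near (0, 0), one near (10, 10).
pts = [(0.1, 0.2), (0.0, -0.1), (0.3, 0.1),
       (10.1, 9.9), (9.8, 10.2), (10.0, 10.1)]
centers, clusters = kmeans(pts, k=2)
```

The same alternating structure (assign, then re-estimate) reappears in the Expectation-Maximization algorithm for mixture models, with soft responsibilities in place of hard assignments.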


Watching the Shift Towards More Symbolic AGI - DataScienceCentral.com

#artificialintelligence

"The fact that LeCun is even considering a hypothesis that embraces symbol manipulation, learned or otherwise, represents a monumental concession, if not a complete about-face. Historians of artificial intelligence should in fact see the Noema essay as a major turning point, in which one of the three pioneers of deep learning first directly acknowledges the inevitability of hybrid AI." There are other examples as well, such as Andrew Ng, Jürgen Schmidhuber's AI company NNAISENSE, and Yoshua Bengio's discussion of "System 2" cognition.


Empowering Industry 5.0 with Advanced Manufacturing Execution systems - DataScienceCentral.com

#artificialintelligence

Industry 5.0 is a relatively new concept that refers to the integration of human intelligence with advanced technologies such as artificial intelligence, machine learning, and robotics. It represents a new stage in the evolution of industry and manufacturing, where human creativity and problem-solving abilities are combined with cutting-edge technologies to create innovative and efficient manufacturing processes. In the context of Manufacturing Execution Systems (MES), Industry 5.0 refers to the use of advanced technologies to optimize and streamline manufacturing operations, while still leveraging the unique skills and insights of human operators. MES are software systems that track and manage the production process from raw materials to finished products. They provide real-time information on production status, inventory levels, and quality control, and can help identify bottlenecks and inefficiencies in the manufacturing process.
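
As a rough illustration of the MES responsibilities described above (tracking orders from raw materials to finished products, reporting real-time status, and flagging bottlenecks), here is a deliberately simplified sketch; the station names and the queue-length bottleneck heuristic are assumptions made for the example:

```python
from collections import defaultdict

class MiniMES:
    """Toy MES: track work orders per station and flag the likely bottleneck."""

    def __init__(self, stations):
        self.stations = list(stations)
        self.queues = defaultdict(list)  # station -> pending order ids

    def release(self, order_id):
        """Release a new work order to the first station."""
        self.queues[self.stations[0]].append(order_id)

    def complete(self, station, order_id):
        """Mark an order done at a station and move it downstream."""
        self.queues[station].remove(order_id)
        i = self.stations.index(station)
        if i + 1 < len(self.stations):
            self.queues[self.stations[i + 1]].append(order_id)

    def status(self):
        """Real-time view: work-in-progress count per station."""
        return {s: len(self.queues[s]) for s in self.stations}

    def bottleneck(self):
        """The station with the most queued work is the likely bottleneck."""
        return max(self.stations, key=lambda s: len(self.queues[s]))

mes = MiniMES(["cutting", "assembly", "inspection"])
for oid in ("WO-1", "WO-2", "WO-3"):
    mes.release(oid)
mes.complete("cutting", "WO-1")
mes.complete("cutting", "WO-2")
```

In an Industry 5.0 setting, a view like `status()` would be surfaced to human operators, who apply judgment to the flagged bottleneck rather than acting on the heuristic alone.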


iiot bigdata, Twitter, 2/10/2023 12:06:10 PM, 289067

#artificialintelligence

The graph represents a network of 1,076 Twitter users whose recent tweets contained "iiot bigdata", or who were replied to, mentioned, retweeted or quoted in those tweets, taken from a data set limited to a maximum of 5,000 tweets, tweeted between 3/26/2006 12:00:00 AM and 2/9/2023 5:00:35 PM. The network was obtained from Twitter on Friday, 10 February 2023 at 12:02 UTC. The tweets in the network were tweeted over the 1,037-day, 2-hour, 12-minute period from Wednesday, 08 April 2020 at 22:47 UTC to Friday, 10 February 2023 at 01:00 UTC. There is an edge for each relationship in a tweet: "replies-to", "mentions", "retweet", "quote", "mention in retweet", "mention in reply-to", "mention in quote", and "mention in quote reply-to", plus a self-loop edge for each tweet with none of the above. The graph's vertices were grouped by cluster using the Clauset-Newman-Moore clustering algorithm.
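
The edge-construction rules described above can be sketched in a few lines. This simplified example covers only a subset of the relationship types, and the tweet records are invented for illustration:

```python
def build_edges(tweets):
    """Build an interaction edge list: one edge per reply/mention/retweet/quote
    relationship in a tweet, and a self-loop for tweets with none of them."""
    edges = []
    for t in tweets:
        related = (
            [("replies-to", u) for u in t.get("replies_to", [])]
            + [("mentions", u) for u in t.get("mentions", [])]
            + [("retweet", u) for u in t.get("retweets", [])]
            + [("quote", u) for u in t.get("quotes", [])]
        )
        if related:
            edges.extend((t["author"], user, kind) for kind, user in related)
        else:
            # Tweet with no relationships: self-loop on the author.
            edges.append((t["author"], t["author"], "self-loop"))
    return edges

# Invented tweet records for illustration.
tweets = [
    {"author": "alice", "mentions": ["bob"], "retweets": ["carol"]},
    {"author": "dave"},  # no relationships -> self-loop
]
edges = build_edges(tweets)
```

An edge list like this is what a community-detection step (such as the Clauset-Newman-Moore algorithm mentioned above) would then cluster into groups of densely connected users.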


DSC Weekly 7 February 2023 - Machine Learning Controversy: From No-Code to No-Math - DataScienceCentral.com

#artificialintelligence

One controversial topic in machine learning circles is code versus no-code. Can you be a real data scientist if you don't code? Of course you can: You may be leveraging platforms and the code is one or two layers below the responsibilities of your job. Maybe you managed to automate coding or outsource that part. It does not mean you don't know how to code. Indeed, with tools like ChatGPT, the future may be less coding rather than more.