Monthly Roundup - April 2018

May 4, 2018

Here's our roundup of the AI and ML news we found and wanted to share, along with our take.


Non-tech businesses are beginning to use artificial intelligence at scale

 

Artificial intelligence, once relegated to the tech sector, is becoming a mainstream, multi-industry phenomenon within the business world. Customer service, finance and human resources are three areas where AI is expected to make huge strides in the near future, while entire industries will eventually feel the effects. Insurance companies are using AI to read applicants' facial expressions when making decisions about loan approvals. Johnson & Johnson, for example, uses AI to screen most prospective job applications and candidates. Technologies like robotics and computer vision are no longer only of interest to tech giants like Amazon, Google or Alibaba, but are also becoming attractive and accessible for smaller businesses.

 

Pedro Domingos on the Arms Race in Artificial Intelligence
 

“He who leads in artificial intelligence will rule the world” - Vladimir Putin.

 

In a recent interview with the German magazine Der Spiegel, Professor Pedro Domingos of the University of Washington in Seattle speaks about his book on artificial intelligence, “The Master Algorithm”, taking an optimistic standpoint on the technology. His view is in line with that of Silicon Valley.


Pedro Domingos frames the book in terms of the pursuit of global dominance through AI, giving the pros and cons of the latest arms race, one that is being fought with AI tools. He explains Chinese and Russian interest in the technology, as well as the efforts big industry players such as Google and Facebook are making in machine learning and artificial intelligence. Two major areas of conflict for international stakeholders are accuracy and explainability. The professor is in favour of accuracy and thinks that AI disrupts but, in the end, enhances our lives in a positive way.


Text Embedding Models Contain Bias. Here's Why That Matters

 

This discussion of problematic biases in machine learning models brings to light how models trained on human data encode human bias. While unintended, these biases can lead to a negative user experience. In one cited case study, a messaging app developer evaluating suggested message replies found an association bias between occupation and gender: replies to “Did the engineer finish the project?” swayed toward “Yes, he did.” Even though men currently dominate the engineering field, a user viewing those suggested replies might have a negative experience.

 

Tools for measuring these human-derived biases, such as WEAT (Word Embedding Association Test) scores, are relatively narrow in scope but are a good start toward raising awareness of how existing models behave. Strategies for reducing the occurrence of bias are a new area of research, and the writers of the Google Developers article open the topic up for feedback and discussion.
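For readers curious what a WEAT score actually measures: it compares how strongly two sets of target words associate with two sets of attribute words in an embedding space, using differences in mean cosine similarity. Below is a minimal sketch of the effect-size calculation, using made-up 2-d vectors in place of real word embeddings; the function names and toy data are our own illustration, not code from the Google Developers article.

```python
from math import sqrt

def cos_sim(u, v):
    """Cosine similarity between two vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    return dot / (sqrt(sum(a * a for a in u)) * sqrt(sum(b * b for b in v)))

def assoc(w, A, B):
    """Association of word vector w with attribute set A versus set B:
    mean cosine similarity to A minus mean cosine similarity to B."""
    return (sum(cos_sim(w, a) for a in A) / len(A)
            - sum(cos_sim(w, b) for b in B) / len(B))

def weat_effect_size(X, Y, A, B):
    """WEAT effect size: difference in mean association between target
    sets X and Y, normalised by the standard deviation of associations
    across all targets. Positive values mean X leans toward A."""
    s_all = [assoc(w, A, B) for w in X + Y]
    mean_x = sum(assoc(x, A, B) for x in X) / len(X)
    mean_y = sum(assoc(y, A, B) for y in Y) / len(Y)
    mean_all = sum(s_all) / len(s_all)
    sd = sqrt(sum((s - mean_all) ** 2 for s in s_all) / (len(s_all) - 1))
    return (mean_x - mean_y) / sd

# Toy 2-d "embeddings" (illustrative only, not real word vectors):
A = [[1.0, 0.0]]   # attribute set A, e.g. male terms
B = [[0.0, 1.0]]   # attribute set B, e.g. female terms
X = [[0.9, 0.1]]   # target set X, e.g. career words (leans toward A)
Y = [[0.1, 0.9]]   # target set Y, e.g. family words (leans toward B)

effect = weat_effect_size(X, Y, A, B)  # positive: X associates with A
```

In a real test, X, Y, A and B would each contain many words drawn from a pretrained embedding model, and a large effect size would flag an association, such as the occupation-gender one described above.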


The Scientific Paper Is Obsolete

 

Today’s scientific papers are a far cry from their early predecessors from the 1600s, yet they serve the same purpose: communicating the results of scientific research. But how well does the standard journal-approved PDF handle the complexity that science now embraces through algorithms and immense data sets? Walls of impenetrable text attempting to explain something that cannot be easily grasped in a linear fashion are increasingly giving way to open-source programming environments like Jupyter, which was heavily influenced by Mathematica's computational notebooks. These alternatives have vastly improved the ability of scientists and researchers to communicate elaborate and dynamic findings.


If you'd like to get these articles and more delivered straight to your inbox, be sure to subscribe to our email list.



© 2019 Verge Labs Pty Ltd, ABN 14 622 840 877