Guest Feature

Guest blog: How artificial intelligence is changing and challenging the tech industry

By Business & Finance
11 January 2018

Viktor Kovacevic, Vice-President and General Manager of Comtrade Digital Services, talks about how artificial intelligence (AI) is changing and challenging a wide range of industries.


Viktor Kovacevic

Comtrade Digital Services, like many others in software development and quality assurance, has committed to investing in artificial intelligence (AI) as part of its growth strategy to serve its clients’ future needs. As a recognised thought leader in digital transformation, Comtrade Digital Services invited experts from a range of industries to share their views on this fast-evolving area at its Quest for Quality Conference in Dublin, Ireland.

A report from market research firm Tractica forecasts that annual global revenue for AI products and services will grow from $643.7 million in 2016 to $36.8 billion by 2025, so it is safe to say that AI signifies the next big technological shift and will become commonplace across industries such as financial services, healthcare and advertising.

In fact, the general consensus is that most companies have already started implementing an AI programme, but with mixed results. Some organisations, however, are seeing material success. Take CERN, the European Organization for Nuclear Research and one of the world’s largest centres for scientific research.

CERN is currently using an AI programme to analyse, record and process around 600 petabytes of data coming from the large detectors in the Large Hadron Collider (LHC), and to make that information publication-ready.

CERN’s Chief Research Officer, Fons Rademakers, revealed: “At CERN, we have our own systems that we benchmark industry products against. We continue to use and keep improving many of our own algorithms, but this method keeps both parties honest and leads to progress.”

One area of concern noted by the panel of experts was that of quality assurance (QA). While QA is the backbone of reliability when implementing any new product, technique or system, it doesn’t truly exist in the AI engineering process. This is an issue that requires close attention and customisation of traditional processes and practices.

The fact that a system is sophisticated and machine-intelligent does not necessarily mean that it can be trusted or should be left unmonitored. Developing a way of testing AI systems is therefore imperative if they are to become a reliable resource for companies across all industries. This is made more complicated by the fact that such learning systems evolve over time, meaning that the output for a given input will not always be the same.

Vincent Lonij of IBM Research Ireland explained: “It’s important to figure out how you can test if the way it’s behaving is the way it’s supposed to be behaving. The system from a statistical perspective should still reach certain benchmarks. Rather than one input, give it a million inputs and make sure that, on average, it reaches all the benchmarks you know it’s supposed to be able to reach.”

According to Dr Robert Ross of the Dublin Institute of Technology, “It’s about creating conditions whereby we can test, validate and investigate the algorithm, like we would any other software component.”
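To make this idea concrete, the sketch below illustrates what such a statistical acceptance test might look like. It is a minimal illustration only: the predict() function is a hypothetical stand-in for a real model’s inference call, and the evaluation data is synthetic. The point is that, rather than asserting an exact output for one input, the test checks that accuracy over a large batch of labelled examples stays above an agreed benchmark.

```python
# A minimal sketch of statistical acceptance testing for a learned model.
# `predict()` is a hypothetical placeholder for a real model's inference call,
# and the evaluation set is synthetic; the structure of the check is the point.

import random


def predict(features):
    # Placeholder for the real model's prediction; assumed interface.
    return int(sum(features) > 0)


def evaluate(examples, benchmark=0.95):
    """Return (passed, accuracy) for the model over a labelled evaluation set."""
    correct = sum(1 for features, label in examples if predict(features) == label)
    accuracy = correct / len(examples)
    return accuracy >= benchmark, accuracy


if __name__ == "__main__":
    # Synthetic labelled examples standing in for the "million inputs" in the quote.
    examples = []
    for _ in range(10_000):
        features = [random.uniform(-1, 1) for _ in range(4)]
        examples.append((features, int(sum(features) > 0)))

    passed, accuracy = evaluate(examples, benchmark=0.95)
    print(f"accuracy={accuracy:.3f}, benchmark met: {passed}")
```

Framed this way, the check can live alongside conventional unit tests: the assertion is made on aggregate behaviour rather than on any single, possibly non-deterministic, output.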

Digital Services strategist Albert Eng commented: “It should be obvious that, even with the advancement of AI technology and engineering, QA is not guaranteed, and when AI-based techniques are introduced from the research and development component of an organisation to the mainstream part of it, new practices have to be applied to ensure that it is performing adequately and accurately. In fact, AI and machine learning has complicated the QA process greatly.”

Another topic of discussion within AI is ‘big data’, specifically its quality and impact. Companies are working with real data, sometimes structured and spread across different silos within the organisation. The challenge is not only figuring out what the data shows, but also determining its value, what the company is looking for and how the data can be used.

Big data further complicates the QA process, as does the acquisition of new realms of unstructured information from across the internet. When Eng challenged the panellists on this point, all agreed that the enterprise data warehouse still has value within the totality of big data, but that it is becoming a less important source of the new correlations that AI and machine learning systems will discover.

For now, AI and machine learning are benefitting certain companies and industries to some degree, enabling more refined and efficient ways of working. There are also hopes of greater progress in the areas of autonomous vehicles and intelligent personal assistants. The possibilities don’t end there, of course, with talk of human brain interfaces and more sophisticated modes of interaction with computer systems.

However, there is still a great deal of research and work to be done. Not only is it currently impossible to pick which platforms are doing it best, but many of the major hurdles associated with AI have yet to be resolved. Both will have to happen before AI becomes an integral, accurate and reliable part of life. But the journey into AI has to start now.

Comtrade Digital Services is a provider of strategic software engineering services and solutions. The organisation enables companies across different industries to innovate faster and reinvent their business models digitally, by using agile development methodologies, innovative technology and business acumen.