Information for Business from Lenovo
Performance + Productivity
Contributor: Graeme Philipson
Artificial intelligence comes of age

Artificial intelligence (AI) has been one of the dreams of computing since the popular press dubbed the first primitive mainframes “electronic brains” back in the 1950s. Now, it seems, that dream is coming true.

Driven by massive advances in computing power, AI is becoming mainstream. The term is not as widely used as it once was, but that is only because the concepts it describes have become so ubiquitous.

Generally speaking, AI, sometimes called cognitive computing, is used in applications that demand the kind of processing that human brains are capable of, but very often at much higher speeds.

AI is employed in many applications:

  • Neural networks: Where computers are used to attempt to simulate human thought processes and learn from experience.
  • Semantic computing: The use of AI to analyse unstructured textual data, looking for patterns and meaning across thousands or even millions of documents.
  • Language translation and natural language processing: Increasingly, raw computing power is used to match examples of words and phrases in different languages, rather than attempt algorithmic translations.
  • Virtual reality: Computer-simulated reality that attempts to digitally recreate an actual or imaginary environment, and to have humans interact with it.
  • Cybersecurity and encryption: Pattern matching, one of the human brain’s most complex skills, can be emulated by raw computing power to unravel cryptographic secrets.
  • Autonomic computing: Self-managing computer systems that are capable of diagnosing and repairing themselves.
  • Robotics: Human-type robots are still a long way off, but increasingly AI techniques are being used to help industrial robots perform complex tasks, and to develop multi-purpose robotic devices.
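The first item in the list, learning from experience, can be illustrated with the simplest possible neural network. The sketch below is purely illustrative and assumes nothing from the article: a single artificial neuron (a perceptron) learns the logical AND function by nudging its weights every time it makes a mistake.

```python
# A minimal sketch of "learning from experience": one artificial neuron
# (a perceptron) learns logical AND by adjusting its weights after each
# wrong answer. All names and values here are illustrative.

def train_perceptron(samples, epochs=20, lr=0.1):
    """Train one neuron on (inputs, target) pairs; returns (weights, bias)."""
    w, b = [0.0, 0.0], 0.0
    for _ in range(epochs):
        for (x1, x2), target in samples:
            # Step activation: fire (1) if the weighted sum crosses zero.
            output = 1 if w[0] * x1 + w[1] * x2 + b > 0 else 0
            error = target - output  # non-zero only on a mistake
            w[0] += lr * error * x1
            w[1] += lr * error * x2
            b += lr * error
    return w, b

def predict(w, b, x1, x2):
    return 1 if w[0] * x1 + w[1] * x2 + b > 0 else 0

# The four "experiences": the truth table for logical AND.
data = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]
w, b = train_perceptron(data)
```

After a handful of passes over the data the neuron classifies all four cases correctly; modern networks apply the same idea with millions of neurons and far richer training signals.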

Can computers think?

AI is hot. Applications based on AI technologies are becoming commonplace. A key debate in the field is whether computers can actually ‘think’. This is sometimes extended to a discussion about whether it is possible for a machine to have a soul, or even be alive. These are metaphysical discussions best left to philosophers – the key question for most people working in AI is whether human thought can be simulated.

If a computer does the same job as the human brain, or its work is indistinguishable from that of the human brain, then, so the argument goes, the question of ‘thinking’ becomes immaterial. The best-known statement of this idea is the Turing Test, proposed by Alan Turing in 1950.

Alan Turing, whose wartime codebreaking at Bletchley Park helped crack the German Enigma ciphers and win the war, postulated that if you held a conversation with a computer and you could not tell whether you were talking to a machine or a human being, then the computer would match the intelligence of a human. If it walks like a duck…

One of the most important thinkers about AI is Ray Kurzweil, an American inventor and one of the fathers of speech recognition. His book The Age of Spiritual Machines is one of the key works on the subject, but more important is his idea of the ‘singularity’.

Kurzweil believes that advances in AI and computing power will mean that within 20 or 30 years computers will be more intelligent than humans, and will be capable themselves of designing even more intelligent machines. He is an optimist and believes that humans and computers will cooperate and build a marvellous future.

Others see computers as competing with humankind as the dominant life force of the planet. Stories based on this rivalry have been a staple of science fiction for decades.

The power of AI today

In the meantime, there is no doubt that AI-based systems are breaking new barriers in performance and versatility. More than half a century after the concept was first explained, Moore’s law remains in force and ensures the continued growth in computing power that is driving the AI boom.
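The scale of that growth is easy to state as arithmetic. The snippet below is a back-of-the-envelope illustration, assuming the common reading of Moore's law as a doubling of computing power roughly every two years.

```python
# Moore's law as compound growth: capacity doubling about every two
# years. The doubling period is the commonly quoted approximation,
# used here purely for illustration.

def growth_factor(years, doubling_period=2.0):
    """How many times capacity multiplies over the given span of years."""
    return 2 ** (years / doubling_period)

decade = growth_factor(10)  # five doublings: a 32x increase
```

On that reading, a decade of doublings yields a 32-fold increase, which is why a modest modern workstation cluster can plausibly outpace a supercomputer from ten years earlier.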

A good example of AI running on Lenovo hardware comes from Bit Theory, which uses a software engine called Athena to develop computer animations for video and film productions.

It was developed in partnership with Lenovo, Autodesk and Intel, and is powered by a cluster of 30 Lenovo ThinkStation D20 dual quad-core workstations that together have more computing power than the largest supercomputer of only a decade ago.

Athena uses AI techniques to automate key parts of the animation process. It processes textual instructions for the animation, and if it doesn’t understand something, it surrounds the unknown words or phrases with question marks; it can then receive either a more detailed description or a link to a visual object.
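The flag-and-ask behaviour described above can be sketched in a few lines. This is a hedged illustration only: Athena's real interface is not public, and the vocabulary, function name, and question-mark convention below are all invented for the example.

```python
# Illustrative sketch of flagging unknown words in a textual instruction:
# scan each word against a known vocabulary and surround unrecognised
# words with question marks so a human can supply a description or a
# link to a visual object. Vocabulary and wording are hypothetical.

KNOWN_WORDS = {"a", "red", "ball", "bounces", "across", "the", "floor"}

def flag_unknown(instruction, vocabulary=KNOWN_WORDS):
    """Return the instruction with out-of-vocabulary words marked."""
    out = []
    for word in instruction.lower().split():
        out.append(word if word in vocabulary else f"???{word}???")
    return " ".join(out)

flag_unknown("A red ball bounces across the gzorp")
# → "a red ball bounces across the ???gzorp???"
```

In a real system the vocabulary check would be replaced by a learned language model, but the interaction loop, flag what you don't understand and ask for more detail, is the same.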

As processing power continues to improve, the range of AI applications continues to expand, much as the human brain evolved ever-wider capabilities. But, as Ray Kurzweil has predicted, it is happening far more quickly.

The challenge is to harness its capabilities effectively.
