The Latest Trends in Computer Science
1 - Quantum computing: Moore’s law is failing[1]. Both physically and economically, we are at the end of our silicon rope. We are almost at the physical limit of how many transistors we can cram into a chip, which is why the last few years have been all about adding cores and better load distribution rather than new architectures. You can only do so much with silicon and classical computing. To preserve the long-term exponential growth of computing power per dollar, we need a paradigm shift, and that shift is coming from quantum computing. However, using qubits instead of bits has its own limits: quantum computers are very good at some things and very inept at others. Over the medium term (10–15 years), most advances will come from designing hybrid systems that combine quantum and classical components, which will require computer science jobs and topics that do not yet exist. If I were 20, I would definitely devote some serious effort to following quantum computing innovations.
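To make the bit/qubit distinction concrete, here is a minimal state-vector toy in Python, assuming only NumPy; it is a classical simulation for illustration, not a program for a real quantum computer.

```python
# A minimal sketch of the difference between a bit and a qubit: a classical
# bit is 0 or 1, while a qubit is a complex amplitude vector that only
# collapses to 0 or 1 when measured.
import numpy as np

ket0 = np.array([1.0, 0.0])                    # |0>, the analogue of bit 0
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)   # Hadamard gate

state = H @ ket0                          # equal superposition of |0> and |1>
probabilities = np.abs(state) ** 2        # Born rule: measurement probabilities

# Simulate 1000 measurements: unlike a bit, the outcome is probabilistic.
samples = np.random.choice([0, 1], size=1000, p=probabilities)
print(f"P(0) = {probabilities[0]:.2f}, P(1) = {probabilities[1]:.2f}")
print(f"measured 0 in {np.mean(samples == 0):.1%} of runs")
```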
2 - Deep Learning and Natural Language Processing: These two are in fact indispensable for building a system that can apply a general learning ability (we are not there yet) using existing, unstructured (i.e. not specifically designed to be fed into the system) sources of information (we are halfway there). If we are to move towards general artificial superintelligence, we will have to build a system that can do unsupervised learning across multiple domains without explicit intervention from programmers. Much effort will be spent on developing learning algorithms that are designed not for specific problems but for general information processing.
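As a small concrete illustration, here is a sketch of unsupervised learning on unstructured text, assuming scikit-learn is available; the toy documents are invented for the example, and the point is only that a grouping emerges without labels or task-specific rules.

```python
# Unsupervised learning on raw text: no labels, no hand-written rules.
# The algorithm groups documents purely from statistical regularities.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.cluster import KMeans

documents = [  # toy stand-ins for unstructured information sources
    "qubits enable quantum speedups for some problems",
    "quantum error correction protects fragile qubits",
    "gradient descent trains deep neural networks",
    "backpropagation computes gradients for deep learning",
]

vectors = TfidfVectorizer().fit_transform(documents)   # text -> feature vectors
labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(vectors)

for doc, label in zip(documents, labels):
    print(label, doc)   # documents grouped without any supervision
```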
3 - Data visualization: This will sound strange to most, but in my opinion existing data visualization practices lag well behind our ability to make inferences from data. Many good routines exist, but many are still very labor-intensive. We have yet to develop an intelligent interface between analytical inference and the visual representation of that inference. I am not an expert on this, but in the next 10 years we might witness dramatic changes in how we translate data analysis results for end users.
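To illustrate what such an interface might look like at its crudest, here is a hypothetical sketch assuming pandas and matplotlib; the auto_plot function and its dtype rules are invented placeholders, not an existing tool.

```python
# A toy "intelligent" bridge from analysis to visuals: choose a chart from
# the data's dtypes instead of asking the analyst to hand-craft every figure.
import pandas as pd
import matplotlib.pyplot as plt

def auto_plot(df: pd.DataFrame, x: str, y: str) -> None:
    """Pick a chart type from column dtypes: numeric x -> scatter,
    categorical x -> bar of group means. (Hypothetical, deliberately crude.)"""
    if pd.api.types.is_numeric_dtype(df[x]):
        df.plot.scatter(x=x, y=y)            # two numeric axes: scatter
    else:
        df.groupby(x)[y].mean().plot.bar()   # categorical axis: bar chart
    plt.show()

df = pd.DataFrame({"region": ["north", "south", "north", "south"],
                   "sales": [10, 14, 12, 9]})
auto_plot(df, x="region", y="sales")
```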
4 - Autonomous systems: Self-driving cars, homeostatic control systems for your house, health monitoring implants, robotic asteroid miners, self-replicating robots for space exploration, and the like. The Internet of Things is here, but it has still not exploded; a lot of room for rapid development is on the horizon. People who can design efficient and seamless communication and coordination between the “things” in the Internet of Things will be in high demand.
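Much of that coordination rests on some form of publish/subscribe messaging (MQTT being a common example). The sketch below is a toy in-process version; the Bus class and topic names are invented for illustration, standing in for a real broker.

```python
# A minimal publish/subscribe pattern: devices publish readings to topics,
# and any interested device subscribes, without the two knowing each other.
from collections import defaultdict
from typing import Callable

class Bus:
    """Toy message broker coordinating 'things' over named topics."""
    def __init__(self) -> None:
        self.subscribers: dict[str, list[Callable[[str], None]]] = defaultdict(list)

    def subscribe(self, topic: str, handler: Callable[[str], None]) -> None:
        self.subscribers[topic].append(handler)

    def publish(self, topic: str, message: str) -> None:
        for handler in self.subscribers[topic]:
            handler(message)

bus = Bus()
# A thermostat reacts to readings published by a temperature sensor.
bus.subscribe("home/temperature", lambda m: print(f"thermostat saw: {m}"))
bus.publish("home/temperature", "21.5C")
```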
5 - Neural interfaces between digital and biological systems: As Elon Musk says, the main bottleneck in brain-computer communication is human output. Our input system (the visual cortex) has incredibly high bandwidth; we can absorb immense amounts of information through our visual (and auditory) systems. The problem is output (or input to the digital system). We have a very precise but very inefficient meat-stick method (i.e. typing commands) and a much faster but harder-to-process vocal channel (speech recognition). Speech recognition has reached impressive levels, but our output bandwidth still falls far short of our input bandwidth. To match our visual information processing capacity we need direct neural interfaces with computers: systems that can interpret neural signals as information output. Some rudimentary successes have been reported over the last few years, but the real development is still ahead.
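Back-of-the-envelope arithmetic makes the asymmetry vivid; the numbers below are rough, commonly cited order-of-magnitude estimates, not measurements.

```python
# Rough comparison of human output bandwidth (typing) vs input bandwidth
# (vision). All figures are coarse literature estimates.
words_per_minute = 60      # brisk typing speed
chars_per_word = 5
bits_per_char = 1.5        # Shannon's estimate of the entropy of English text

typing_bps = words_per_minute * chars_per_word * bits_per_char / 60
visual_bps = 1e7           # order-of-magnitude estimate for retinal throughput

print(f"typing output: ~{typing_bps:.0f} bits/s")
print(f"visual input:  ~{visual_bps:.0e} bits/s")
print(f"gap:           ~{visual_bps / typing_bps:,.0f}x")
```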
I think we can probably think of many more future topics in computer science. If we go into the details, I think we will see some interdisciplinary cross-breeding between medicine and computer science (much like what happened with genomics and bioinformatics), and probably many more trends that few people can predict.
Posted By
P. Revathi, M.Sc., M.Phil.,
Department of Computer Science
MKJC, Vaniyambadi.