
Moore's Law describes the growth of computing power over time. Introduced by Gordon Moore in 1965, it is gauged by the maximum number of transistors that can be placed on a microprocessor or memory chip at a given point in time. Moore predicted that the number of transistors on an integrated circuit would double approximately every two years.
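To make the doubling concrete, the prediction can be written as a simple formula, N(t) = N₀ · 2^(t/2), where t is the number of years elapsed and N₀ is a starting transistor count. Below is a minimal sketch of that formula; the Intel 4004's roughly 2,300 transistors (1971) is used purely as an illustrative starting value, not as part of Moore's original statement.

```python
def transistors(years_elapsed, initial=2300, doubling_period=2.0):
    """Projected transistor count after `years_elapsed` years,
    assuming a doubling every `doubling_period` years."""
    return initial * 2 ** (years_elapsed / doubling_period)

# Illustrative projections starting from the Intel 4004 (~2,300 transistors, 1971).
for year in (1971, 1981, 1991, 2001, 2011):
    print(f"{year}: ~{transistors(year - 1971):,.0f} transistors")
```

Run as-is, the loop shows the character of exponential growth: each decade multiplies the count by 2⁵ = 32, so five doublings per decade carry the projection from thousands of transistors into the billions within forty years.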
Whether upheld by the accuracy of Moore's visionary prediction or driven by the industry's determination to keep pace with it, the trend has held true for almost five decades. This exponential growth in computing has affected nearly every avenue of life, including (but not limited to) personal computers, communications, transportation, navigation, agriculture, medicine, world finance, education, and social media. Some industry experts believe Moore's Law will reach a fundamental limit within the next few decades, while others expect a revolution in microprocessor technology to sustain the trend.
Many visionaries believe that humanity will soon reach a point in its history that can only be described as unpredictable, perhaps even unsettling. If these predictions hold, that point will be the result of an intelligence explosion triggered by a technological singularity: a moment in human history beyond which life, reshaped by runaway technological progress, becomes unpredictable or incomprehensible.
These are only predictions, though. The fact of the matter is this: we don't know. We don't yet know enough about consciousness to apply the concept to a machine, if that is even possible...
This course includes: