Devices have been used to aid computation for thousands of years, probably beginning with the tally stick. The Antikythera mechanism, dating from about the beginning of the first century BC, is generally considered the earliest known mechanical analog computer and the earliest known geared mechanism. Comparable geared devices did not emerge in Europe until the 16th century, and it was not until the late 17th century, with Leibniz's stepped reckoner, that the first mechanical calculator capable of performing all four basic arithmetical operations was developed.
Around 1980, computers with integrated circuits on a single chip became affordable and began to be designed specifically for consumers. The first IBM PC was released in 1981. Graphical user interfaces (GUIs), the programs that let users work with windows, clickable commands, and color, appeared in the mid-1980s: the first Apple Macintosh came on the market in 1984, and Microsoft Windows followed in 1985. Fifteen years ago, 24 megabytes of random access memory (RAM) was considered a great amount; today most new computers boast a gigabyte of memory, an increase of over forty times.

As technology has advanced, information technology has also shifted its focus from single computers to networks of computers. Networked computers allow many different users to access common databases, which is why databases now hold most business records for large companies. Software that implements these databases has quickly grown into a billion-dollar industry, and Customer Relationship Management (CRM) software drives business activity for many large enterprises that sell products across the world.