Information technology is a broad field that includes a number of different areas of practice and study, ranging from business and information systems to computer science, computer hardware development, programming, and systems administration. Within this broad field there are further subfields. For example, a computer scientist who focuses on programming may study artificial intelligence or cloud computing.
Computers are all around us today, but information technology means more than simply using computers or being proficient in a specific type of computer software. Those who earn a bachelor’s degree in computer science can pursue a career as a computer systems analyst, software developer, or data analyst. Jobs can include designing and deploying an information technology system for a wide range of organizations, managing computer networks, or working as a computer programmer developing business or consumer applications.
Because technology is always advancing, those who are interested in a career in information technology need a passion for learning new tools, algorithms, and ways of thinking. The computer system that a person learns on while at university may be vastly different from the one they use in the workplace five or ten years later. Information technology is a field where continuous learning and professional development are a must.
What information technology encompasses
Information technology is a broad field of study. Often, when a person embarks on an information technology degree, they will find themselves exposed to several aspects of the field at a basic level. They can then choose to specialize through postgraduate study or professional qualifications after graduation.
Alternatively, a would-be specialist may choose a degree that focuses on a specific area such as machine learning or computer systems design.
What areas of study are included in information technology?
There are many options when it comes to studying for an information technology degree. Some common bachelor’s degrees include:
Computer science
A computer science degree is a broad degree for those who are interested in exploring the different options open to a computer scientist. This degree may cover computer hardware and software, networking, information security, and how computers work at a very detailed level. This type of degree includes a lot of theory and what computer scientists call ‘low-level’ work. In this case, low-level means programming computers by talking directly to the hardware rather than using more human-friendly ways of working with the machine.
Business and information systems
A bachelor’s degree in business and information systems focuses on the more practical applications of computers in the world of business. This degree teaches students about how information systems can help businesses, how databases work, and how to design effective systems. Students also learn about best practices, ethics, data protection, and security.
Game development
In some ways, video games are at the leading edge of technology. They push the limits of what computer hardware can do and are often the testing ground for new systems including improved graphics, scaling challenges, augmented reality, artificial intelligence, and even blockchain technologies. Game development degrees often cover programming, design, usability, animation, and the mathematics associated with programming.
Artificial intelligence and machine learning
Artificial intelligence and machine learning are important areas of information technology that have applications in other areas including finance, the core sciences, and even setting public policy. Machine learning is often used in those fields and is increasingly being taught as a module within degrees in those subjects. It is possible to specialize in machine learning, and students will learn about mathematics, probability, and popular tools such as Kubernetes and TensorFlow.
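The core idea behind machine learning, fitting a model’s parameters to data, can be sketched without any of those tools. The following toy example in plain Python (not TensorFlow itself; the data is made up for illustration) fits a straight line by ordinary least squares:

```python
# Fit y = a*x + b by ordinary least squares -- one of the simplest "models".
def fit_line(xs, ys):
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    # Slope = covariance(x, y) / variance(x); intercept from the means.
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    var = sum((x - mean_x) ** 2 for x in xs)
    a = cov / var
    b = mean_y - a * mean_x
    return a, b

xs = [1, 2, 3, 4, 5]
ys = [2.1, 3.9, 6.2, 8.0, 9.9]   # noisy samples of roughly y = 2x
a, b = fit_line(xs, ys)
print(a, b)   # a is close to 2, b is close to 0
```

Real machine-learning systems fit millions of parameters instead of two, but the principle of minimizing the error between predictions and observed data is the same.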
Cloud computing
Cloud computing is still a relatively young field in the world of information systems, but there are some universities that are offering courses relating to it. Those who study cloud computing learn about cloud services, Site Reliability Engineering (SRE), scalability, and security. They learn how to manage cloud services and how to develop services that make use of the flexibility and power of the cloud.
What are the main elements of information technology?
There are four main elements of information technology:
- Information security
- Database and network management
- Computer technical support
- Software development
Each of these fields is considered a relatively specialist area; however, there is a lot of overlap. Most business software development requires the use of databases and networking, for example. In addition, wherever networking is used, information security becomes important.
This means skilled developers will have some fluency in all areas. Studying usability and working alongside technical support specialists helps them make software that is easy to use. Understanding good database design and how networks work ensures the applications developers build are robust. Following best practices, such as being aware of the OWASP Top 10 vulnerabilities for web applications, helps improve application security.
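As a small illustration of why the OWASP guidance matters, the sketch below (using Python’s built-in sqlite3 module, with a made-up users table) shows the parameterized-query pattern that defends against SQL injection, a classic OWASP Top 10 vulnerability:

```python
import sqlite3

# In-memory database with a sample table for the demonstration.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT, role TEXT)")
conn.execute("INSERT INTO users VALUES ('alice', 'admin')")

def find_user(name):
    # Unsafe: f"SELECT ... WHERE name = '{name}'" lets input rewrite the query.
    # Safe: the ? placeholder makes the driver treat the value purely as data.
    cur = conn.execute("SELECT name, role FROM users WHERE name = ?", (name,))
    return cur.fetchall()

print(find_user("alice"))        # [('alice', 'admin')]
print(find_user("' OR '1'='1"))  # [] -- the injection payload matches nothing
```

With string concatenation, the second call would have returned every row in the table; with the placeholder, the attack string is just an unusual name that matches no one.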
How can you work toward a career in information technology?
The field of information technology is highly competitive. It’s also one where remote working is often possible, which means many big companies outsource their information technology work internationally. As a result, simply having a bachelor’s degree is not always enough to find a good job.
The job prospects for those with additional certifications or a master’s degree, however, can be promising. The median pay for a computer and information research scientist, for example, is $122,840 per year.
The best route to a job in information technology depends on the specific area of the field that you’re interested in working in.
Working towards a job in computer network engineering
Many employers expect network engineers to have applicable certifications and real-world experience. For example, Cisco offers the Cisco Certified Internetwork Expert (CCIE) qualification, and Microsoft has its own vendor-specific certifications, including Azure Administrator Associate and the Microsoft 365 Certified Enterprise Administrator Expert.
Those who wish to deploy and configure networks may find it beneficial to pursue Cisco’s qualification or a similar one from another vendor. Those looking to work on the systems side may find the Microsoft (or equivalent Linux) qualifications useful.
At the entry level, qualifications such as CompTIA’s A+ show that someone has a basic level of computer systems literacy.
Working towards a job in information security
Information security is a broad field that includes both maintaining security and testing for vulnerabilities. In the industry, those who work to maintain security are known as “Blue Team” engineers, while those who work to find vulnerabilities and engage in penetration testing are known as “Red Team” engineers.
Security specialists working at the network level will learn about firewalls, proxies, routers, and related services. Developers with a focus on security look for bugs in code that could allow an attacker to gain access to a system.
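A path traversal bug, where a user-supplied filename like ../../etc/passwd escapes the directory it should be confined to, is one classic flaw such developers hunt for. Below is a minimal, hypothetical defensive check in Python (the function name and directory paths are illustrative, not from any particular codebase):

```python
import os

def safe_join(base_dir, user_path):
    """Resolve a user-supplied path and refuse anything outside base_dir."""
    base = os.path.realpath(base_dir)
    candidate = os.path.realpath(os.path.join(base, user_path))
    # commonpath equals base only when candidate sits inside the base directory.
    if os.path.commonpath([base, candidate]) != base:
        raise ValueError("path escapes the allowed directory")
    return candidate

safe_join("/srv/files", "docs/report.txt")     # allowed: stays inside /srv/files
# safe_join("/srv/files", "../../etc/passwd")  # raises ValueError
```

The key step is normalizing the path before checking it; comparing the raw string would miss tricks like "docs/../../etc/passwd".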
Again, after graduating with a degree in computer science, earning a qualification such as the EC-Council’s Certified Ethical Hacker (CEH) could be a good starting point. The certification is recognized internationally.
Working towards a job in computer systems design
Most information systems and computer science degrees give learners a basic understanding of computer systems design, but there are many development practices used in the real world that a student may not get exposure to.
Agile development, Scrum, and other iterative development practices are popular in this fast-moving industry. Would-be designers should familiarize themselves with these practices, their workflows, and their terminology. One good option is to join a small open source project and work with developers, designers, translators, and UI experts to finish a product or improve something that has already been released, then add that work to a portfolio.
Working towards a job in computer programming
A robust portfolio is essential for anyone who is looking for a job in computer programming. Computer science and information technology degrees often put a heavy emphasis on algorithms and achieving specific goals using code, but may not require a finished product until the final-year project.
Anyone wishing to pursue a job as a computer programmer should start working on projects independently in their spare time. A combination of simple individual projects and submissions to open source projects is a good idea. Most developers open accounts on GitHub, which they can use to share their contributions. Prospective employers can check an applicant’s GitHub profile to see what contributions they have made to projects and verify the quality of their code.
Working towards a job in tech support
Tech support is perhaps the easiest field to get into. Pay for computer support specialists is lower than for some other information systems jobs, averaging $54,760 per year, a figure that includes second-line and more experienced staff. A computer support specialist supporting an obscure, old, or complex information system will likely earn more than someone supporting consumer technology.
Fortunately, entry-level jobs are widely available and are usually open to anyone who demonstrates competency in the product or service being supported and who has good communication skills. Once a person has a job with a tech support provider, the opportunity to receive training and be promoted internally becomes available.
Tech support prospects can vary significantly, however. Someone supporting end users with Microsoft Windows may find their opportunities for progression a little more limited than someone working in a cloud services environment, where support requirements are more complex.
Why information technology is important
Information technology is a fascinating and important industry and one that has an impact on almost every other sector. Today, we rely on computers for so many parts of our day-to-day work, entertainment and social lives.
We use information technology every time we access social media, or even watch a show on our favorite streaming service. It’s computers that allow us to buy products online, send emails, and access news services online. Even when we buy from a brick-and-mortar store, it’s highly likely the point-of-sale system or till in the store is powered by a computer.
Computers are used to track the goods that we buy from the time they’re on the production line to when they’re sent to the warehouse, and then the store. Information technology is used by researchers to design new products and even new medicines. Our doctors use a computerized system to log our medical records.
Artificial intelligence is used to predict the flow of traffic and make town planning decisions. It’s used in weather forecasting, and in researching new drug treatments. Cloud computing allows the services we rely so heavily on to scale and be just as reliable with one million users as they were when they were new and had just one thousand.
When those systems are working, we don’t notice them. It’s computer scientists who create those systems and keep them working well for us.
Trends and developments in information technology
Information systems can be used anywhere there is data to be processed, whether that’s storing the bits and bytes that make up the images and sound of your favorite TV show or logging information about your tax return.
The broad use of computers means there are developments happening on several fronts:
Cybersecurity and encryption
That data processing comes with many challenges, not least of which is cybersecurity. It may not be particularly important if a malicious person gains access to a social platform where users share photographs of their cats, but those users would still have a right to feel upset about the invasion of their privacy.
Data breaches involving medical records or financial information have far more serious implications. The ubiquity of information systems and the sheer volume of data stored in them today means cyber warfare is becoming incredibly common. Cyber attacks aren’t just carried out by malicious individuals, but by groups looking to make significant sums of money or even influence political events. The Center for Strategic and International Studies lists several recent cyber attacks, showing how significant the war on the digital front is becoming.
Artificial intelligence and cloud computing
The power and flexibility of cloud computing mean artificial intelligence is open to far more people than it was even a few years ago. Historically, only researchers with access to university mainframes could do artificial intelligence-based work, but now anyone with access to a computer and the Internet can use cloud services to build incredibly complex machine learning models.
These models can be used for facial recognition, customer support bots, financial trading algorithms or even modeling call volumes for a contact center. Computers are capable of processing huge amounts of data and using that data to help humans make better decisions.
Augmented reality
Augmented reality is one of the most exciting trends in computer science and is one area where many information technology jobs may open up in the next few years.
Computer gaming fans may be familiar with products such as Ingress and Pokémon Go, which use augmented reality to engage their players. These games rely on GPS technology to turn the real world into a game map and use players’ phones to overlay game items onto the real world.
While it’s gaming that is leading the charge in this area, there are more serious applications for augmented reality too. Computer technology is improving rapidly and the computer hardware required to make AR products work is becoming more affordable and mainstream. This means it may not be long before we see AR being used in education and the workplace.
Many companies, including giants such as Microsoft, are working on AR tools for use in education and training. Imagine an engineer working on a complex piece of machinery at a remote site who needs assistance. Using an AR visor, they could transmit their view of the machine to an expert located off-site, potentially even in another country. That expert can manipulate a virtual model of the machinery, and the AR visor will project the expert’s movements so the engineer can see them.
Instead of the engineer and expert having confusing conversations about the location of specific parts or wires, AR allows them to collaborate as if they’re in the same room.
This is still a new technology, and AR headsets are expensive and cumbersome, but they are getting lighter, more reliable, and more affordable every year. Those who know how to program for them and use them alongside modern communication technology will find many job opportunities in the next few years.
Educational options for people interested in information technology
The information technology field is competitive. We have already discussed some of the challenges job seekers face when entering this field. It’s important that would-be students consider whether the course they are evaluating is truly relevant to their future career.
What are the types of information technology programs?
Information technology programs can be broadly divided into three categories:
Information systems degrees
Information systems degrees focus on the application of technology, usually in business. This type of qualification is a good option for someone who is considering studying for an MBA and moving into a more managerial or executive role later in their career.
Information systems degrees focus on using and maintaining systems, rather than designing and coding them. Most information systems degrees include some coding and an explanation of data structures and networking since this knowledge is important for creating databases and business applications, but the degrees do not go into as much depth about how computers work.
Computer science degrees
Computer science degrees are more theoretical than information systems degrees, focusing on algorithms, mathematics, and the low-level details of how computers and networks work. Computer science is useful for people who wish to go into software or hardware development.
Because the skills learned on a computer science degree are so theoretical, it may be hard to transition directly into the world of business. A would-be computer scientist who wishes to work in a specific industry may need to study for a master’s degree applicable to that industry.
Certifications and real-world experience
The bachelor’s degree and higher education route is not the only way to gain employment in the world of computer science. Some people opt to skip the degree program entirely and focus on real-world experience, a portfolio, and certifications.
It’s important to research certification providers carefully if going down this route. Vendors such as CompTIA, Microsoft, and Cisco are recognized internationally and have a good reputation in the industry. There are many other providers who are not as trusted. Be wary of providers promising cheap and easy certifications, since it’s unlikely any major employers would accept them.
What should you consider when choosing an IT degree?
Learners should take care to choose an accredited university for their IT degree. There are some providers that have a very good reputation for their IT programs, such as the University of South Florida Sarasota-Manatee, but there’s still value to a degree from a smaller university as long as it is accredited.
Check the course content of the degree. How much time is devoted to the specific areas of study (such as web frameworks, cybersecurity, or artificial intelligence) that you are interested in? How much time is spent on practical and personal projects?
Work experience is essential for many IT jobs. A degree that includes a work placement year and has a major final project will carry more weight with corporate employers than one that is purely academic.
What is the highest level of education required for a job in IT?
The entry path for IT jobs can vary dramatically. Most people will start with a bachelor’s degree, then get an entry-level job as a software developer or a junior database administrator. From there, they can gain the experience required to move on to more senior analyst or administrator jobs.
Depending on the employer, there may be an opportunity for on-the-job training and certifications. Since many employers are heavily invested in a particular vendor’s ecosystem it makes sense for them to invest in their employees with training for those products. For example, a database administrator who works with Microsoft’s cloud services may have the opportunity for on-the-job study and Azure certifications.
An information security analyst may still be expected to have a bachelor’s degree simply to show they have a broad understanding of the field and have passed a course of study that requires some academic rigor. The key consideration for getting the job, however, is knowledge of current tools, operating systems, and platforms.
Career opportunities in information technology fields
Once someone has a foundation in information technology they can progress into a number of specializations. Salaries vary massively depending on the field that a person moves into, and also the city they live in.
What is the average starting salary of someone with an IT degree?
Software engineers, and in particular those who are classed as ‘full-stack’ developers capable of working with the front-end and back-end of web development systems, are in the most demand as of 2020. Individuals with those skills can expect to command salaries of around $100,000 or more after a few years in the workforce. Starting salaries are lower than this, however.
The average salary for a computer systems analyst is $61,652. Business analysts and project managers can expect to earn more. Those who opt to focus on cybersecurity have the option of consulting, which may offer higher earnings than being employed by a single company.
Customer support specialists usually earn less than those in more technical roles, but this is reflected by the lack of a degree requirement for many customer support roles. The in-house training opportunities offered for these roles can be appealing for those who are considering a career switch.
How can you strengthen your resume for jobs in IT?
The competitive and ever-changing nature of the computing field means it’s important to work on your resume. Find out what tools are the most popular in the area where you are seeking employment, and take the time to learn them.
Vendor certifications are a good way of boosting your resume, but they can be expensive. There are some free and low-cost certifications offered by Google that may be of value for entry-level IT workers and those who are looking for a career change.
Contributing to open-source projects is perhaps the best option, however. Doing this builds a social network of people who are likely already working in the IT field, and also helps you gain experience with tools, frameworks, and languages that are in use in the real world. The contributions you make can be seen by prospective employers and add to your portfolio, making open source work a good way of improving your resume and learning new skills at the same time.
Many of the most popular open-source projects are run by companies. For example, Canonical maintains Ubuntu, and Red Hat maintains CentOS. Working on such free software could provide a foot in the door for a would-be systems engineer to get a job with a major tech company. This is, of course, a long-term goal, but it is something many engineers have already managed to achieve, and even if it doesn’t happen, the experience is useful for other jobs.