Artificial Intelligence

 Although the concept of artificial intelligence has been around since the 1950s, it continues to push computer scientists to develop new and increasingly complex technologies and to excite the people who use those technologies in daily life. Artificial intelligence is a set of techniques that enable computers to imitate human behavior. AI research, which has pushed the limits of imagination in recent years, presents people with new innovations every day.



In recent years, the world's leading IT organizations, such as Microsoft, Google, Apple, and Facebook, have announced that we no longer live in a mobile-first world. Instead, they describe an AI-first world, in which digital assistants and other intelligent services become the primary means of finding information and getting tasks done. The United Arab Emirates even established a Ministry of Artificial Intelligence, breaking new ground worldwide. The application areas of artificial intelligence grow day by day, and the AI universe continues to expand over time.


While there are many different techniques and approaches for understanding and using AI, the key to machine intelligence is that it must sense, reason, and act, and then adapt based on experience [1].

  • Sense: Identifying and recognizing meaningful objects or concepts within big data. For example, distinguishing a vehicle's tail light, or tumor tissue from normal tissue.
  • Reason: Understanding the wider context and making a plan to achieve a goal. For example, if the goal is to prevent a collision, the vehicle must estimate the crash probability from vehicle behavior, proximity, speed, and road conditions.
  • Act: Recommending the best course of action or taking it directly. For example, based on vehicle and traffic analysis, the system can brake, accelerate, or arm safety mechanisms.
  • Adapt: Finally, at every stage, algorithms should adapt based on experience and grow smarter: recognizing more blind spots, incorporating new variables into the context, adjusting actions based on previous events, and so on.
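The four stages above can be sketched as a simple loop. The collision-avoidance scenario, the time-to-contact heuristic, and all thresholds below are illustrative assumptions for this post, not a production driving model:

```python
# Toy sense -> reason -> act -> adapt loop for the collision example.
# Every formula and threshold here is an illustrative assumption.

def sense(reading):
    """Extract meaningful features from a raw sensor reading."""
    return {"distance_m": reading["distance_m"], "speed_mps": reading["speed_mps"]}

def reason(features):
    """Estimate collision risk: shorter time-to-contact means higher risk."""
    time_to_contact = features["distance_m"] / max(features["speed_mps"], 0.1)
    return min(1.0, 1.0 / time_to_contact)

def act(risk, brake_threshold=0.5):
    """Choose an action once the risk estimate crosses a threshold."""
    return "brake" if risk >= brake_threshold else "maintain"

def adapt(brake_threshold, near_miss_occurred):
    """Tune the threshold from experience: brake earlier after a near miss."""
    return max(0.1, brake_threshold - 0.1) if near_miss_occurred else brake_threshold
```

For instance, a vehicle 10 m away closing at 20 m/s yields a high risk estimate and a "brake" decision, while a distant, slow vehicle yields "maintain"; a near miss then lowers the braking threshold for future decisions.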

Hospitals, clinics, and medical and research institutes produce large amounts of data daily, including laboratory reports, imaging data, pathology reports, diagnostic reports, and drug information [2]. The rapid expansion of healthcare data is one of the biggest challenges physicians face. The current literature suggests that big data and artificial intelligence are viable solutions for handling this data explosion, as well as for meeting social, financial, and technological demands in healthcare. Analyzing such large and complex data is difficult and requires a high level of analytical skill. The most difficult part is interpreting the results and making recommendations grounded in medical experience; this requires years of medical study, knowledge, and specialized skills [3].

In healthcare, data is produced, collected, and stored in many formats, including digital records, text, images, scans, audio, and video. Before applying artificial intelligence to this data, one should first understand its quality and the questions the target data set is expected to answer. The data type then guides the choice of neural network, algorithm, and architecture for AI modeling [3].
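As a minimal illustration of how data type guides architecture choice, the lookup below pairs common healthcare data modalities with the network families typically applied to them. These pairings are conventional starting points, not rules; real projects also weigh data volume, quality, and the clinical question:

```python
# Illustrative mapping from healthcare data modality to a typical
# model family. The pairings are common conventions, not requirements.
ARCHITECTURE_BY_MODALITY = {
    "image": "convolutional neural network (CNN)",      # scans, pathology slides
    "text": "transformer or recurrent network",          # clinical notes, reports
    "signal": "1-D CNN or recurrent network",            # ECG, audio
    "tabular": "gradient boosting or feed-forward net",  # lab values, drug data
}

def suggest_architecture(modality: str) -> str:
    """Return a conventional starting architecture for a data modality."""
    return ARCHITECTURE_BY_MODALITY.get(
        modality, "unknown modality: inspect the data first"
    )
```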

According to market research, the artificial intelligence market in healthcare could reach $6.6 billion by 2021, a growth rate of 40%, and may have the potential to reduce treatment costs by up to 50% [4]. Artificial intelligence applications could also save the health sector $150 billion annually by 2026 [5]. Effective AI-based solutions for the healthcare industry are constantly evolving:

  • Tools to ease the burden on clinics and enable medical professionals to do their jobs more effectively,
  • Filling the gaps in health services during the increasing need for labor,
  • Improving efficiency, quality and results for patients,
  • Integrating health data on various platforms and expanding the access network,
  • Providing more efficiency, transparency and interoperability advantages,
  • Ensuring information security.

The usage areas of artificial intelligence are discussed in detail in the following posts.

[1] Singh, N. (2018). How to Get Started as a Developer in AI. Web: https://software.intel.com/en-us/articles/how-to-get-started-as-a-developer-in-ai

[2] Marr, B. (2017). How AI And Deep Learning Are Now Used To Diagnose Cancer. Web: https://www.forbes.com/sites/bernardmarr/2017/05/16/how-ai-and-deeplearning-is-now-used-to-diagnose-cancer

[3] Saraf, M. K., and Mike, P. (2018). Artificial Intelligence and Healthcare Data. Web: https://software.intel.com/en-us/articles/artificial-intelligence-and-healthcaredata

[4] Sullivan, F. (2016). From $600 M to $6 Billion, Artificial Intelligence Systems Poised for Dramatic Market Expansion in Healthcare. Web: https://ww2.frost.com/news/press-releases/600-m-6-billion-artificial-intelligencesystems-poised-dramatic-market-expansion-healthcare

[5] Collier, M., Fu, R., Yin, L., and Christiansen, P. (2017). Artificial Intelligence: Healthcare's New Nervous System. Web: https://www.accenture.com/t20171215T032059Z__w__/us-en/_acnmedia/PDF-49/Accenture-Health-Artificial-Intelligence.pdf

[6] Savaş, S. (2019). Classification of Carotid Artery Intima Media Thickness with Deep Learning (PhD Thesis). Gazi University, Graduate School of Natural and Applied Sciences, Department of Computer Engineering, Ankara.

[7] Republic of Turkey, Presidency Digital Transformation Office — CBDDO
