Algorithm


An algorithm is a well-defined sequence of instructions or rules that provides a solution to a specific problem or task. Originating in ancient civilizations, algorithms have evolved over centuries and are now integral to modern computing. They are designed using techniques such as divide-and-conquer and are evaluated for efficiency using measures like big O notation. Algorithms can be represented in various forms such as pseudocode, flowcharts, or programming languages. They are executed by translating them into a language that computers can understand, with the speed of execution dependent on the instruction set used. Algorithms can be classified by their implementation or design paradigm, and their efficiency can significantly affect processing time. Understanding and using algorithms effectively is crucial in fields like computer[2] science and artificial intelligence[1].
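To make the divide-and-conquer and big O ideas concrete, here is a minimal sketch (the function name and example values are illustrative, not from the source): binary search repeatedly halves a sorted search space, so its running time is O(log n).

```python
def binary_search(items, target):
    """Return the index of target in the sorted list items, or -1 if absent."""
    low, high = 0, len(items) - 1
    while low <= high:
        mid = (low + high) // 2   # split the remaining search space in half
        if items[mid] == target:
            return mid
        elif items[mid] < target:
            low = mid + 1         # discard the lower half
        else:
            high = mid - 1        # discard the upper half
    return -1

print(binary_search([1, 3, 5, 7, 9], 7))  # 3
```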

Definitions of terms
1. artificial intelligence.
1 Artificial Intelligence (AI) refers to the field of computer science that aims to create systems capable of performing tasks that would normally require human intelligence. These tasks include reasoning, learning, planning, perception, and language understanding. AI draws from different fields including psychology, linguistics, philosophy, and neuroscience. The field is prominent in developing machine learning models and natural language processing systems. It also plays a significant role in creating virtual assistants and affective computing systems. AI applications extend across various sectors including healthcare, industry, government, and education. Despite its benefits, AI also raises ethical and societal concerns, necessitating regulatory policies. AI continues to evolve with advanced techniques such as deep learning and generative AI, offering new possibilities in various industries.
2 Artificial Intelligence, commonly known as AI, is a field of computer science dedicated to creating intelligent machines that perform tasks typically requiring human intellect. These tasks include problem-solving, recognizing speech, understanding natural language, and making decisions. AI is categorised into two types: narrow AI, which is designed to perform a specific task, like voice recognition, and general AI, which can perform any intellectual tasks a human being can do. It's a continuously evolving technology that draws from various fields including computer science, mathematics, psychology, linguistics, and neuroscience. The core concepts of AI include reasoning, knowledge representation, planning, natural language processing, and perception. AI has wide-ranging applications across numerous sectors, from healthcare and gaming to military and creativity, and its ethical considerations and challenges are pivotal to its development and implementation.
2. computer. A computer is a sophisticated device that manipulates data or information according to a set of instructions, called programs. By design, computers can perform a wide range of tasks, from simple arithmetic calculations to complex data processing and analysis. They have evolved over the years, from primitive counting tools like the abacus to modern digital machines. The heart of a computer is its central processing unit (CPU), which includes an arithmetic logic unit (ALU) for performing mathematical operations and registers for storing data. Computers also have memory units, such as ROM and RAM, for storing information. Other components include input/output (I/O) devices that allow interaction with the machine and integrated circuits that enhance the computer's functionality. Major historical innovations, such as Charles Babbage's invention of the first programmable computer and the development of the first automatic electronic digital computer, the Atanasoff-Berry Computer (ABC), have contributed greatly to their evolution. Today, computers power the Internet, connect billions of users worldwide, and have become an essential tool in almost every industry.
Algorithm (Wikipedia)

In mathematics and computer science, an algorithm (/ˈælɡərɪðəm/) is a finite sequence of rigorous instructions, typically used to solve a class of specific problems or to perform a computation. Algorithms are used as specifications for performing calculations and data processing. More advanced algorithms can use conditionals to divert the code execution through various routes (referred to as automated decision-making) and deduce valid inferences (referred to as automated reasoning), achieving automation eventually. Using human characteristics as descriptors of machines in metaphorical ways was already practiced by Alan Turing with terms such as "memory", "search" and "stimulus".

In a loop, subtract the smaller number from the larger one. Halt the loop when the next subtraction would make a number negative. Then check whether one of the two numbers equals zero: if so, take the other number as the greatest common divisor; if not, return the two numbers to the subtraction loop.
Flowchart of using successive subtractions to find the greatest common divisor of numbers r and s
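The flowchart described above can be transcribed almost line for line into code. The following is a minimal sketch under that reading (the function name is mine, not from the source):

```python
def gcd_by_subtraction(r, s):
    """Greatest common divisor of positive integers r and s via successive subtraction."""
    while r != 0 and s != 0:
        if r > s:
            r = r - s   # subtract the smaller number from the larger
        else:
            s = s - r
    # one of the numbers is now zero; the other is the GCD
    return r if r != 0 else s

print(gcd_by_subtraction(1071, 462))  # 21
```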

In contrast, a heuristic is an approach to problem solving that may not be fully specified or may not guarantee correct or optimal results, especially in problem domains where there is no well-defined correct or optimal result. For example, social media recommender systems rely on heuristics that, although widely characterized as "algorithms" in 21st-century popular media, cannot deliver correct results due to the nature of the problem.
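As a hypothetical illustration of this contrast (not from the source), consider a greedy change-making heuristic: with coin denominations 1, 3, and 4, the rule "always take the largest coin that fits" returns three coins for the amount 6, although two coins (3 + 3) suffice, so the heuristic is fast but not guaranteed optimal.

```python
def greedy_coin_change(amount, coins=(4, 3, 1)):
    """Heuristic: repeatedly take the largest coin that still fits."""
    used = []
    for coin in sorted(coins, reverse=True):
        while amount >= coin:
            amount -= coin
            used.append(coin)
    return used

print(greedy_coin_change(6))  # [4, 1, 1] -- three coins, but [3, 3] is optimal
```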

As an effective method, an algorithm can be expressed within a finite amount of space and time and in a well-defined formal language for calculating a function. Starting from an initial state and initial input (perhaps empty), the instructions describe a computation that, when executed, proceeds through a finite number of well-defined successive states, eventually producing "output" and terminating at a final ending state. The transition from one state to the next is not necessarily deterministic; some algorithms, known as randomized algorithms, incorporate random input.
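A small sketch of the randomized case (my example, not from the source): quicksort with a randomly chosen pivot passes through intermediate states that depend on random input, yet always terminates in the same final state, the sorted output.

```python
import random

def randomized_quicksort(items):
    """Quicksort with a random pivot: a randomized algorithm whose state
    transitions depend on random input but whose output is deterministic."""
    if len(items) <= 1:
        return items                       # base case: already sorted
    pivot = random.choice(items)           # the random input
    less = [x for x in items if x < pivot]
    equal = [x for x in items if x == pivot]
    greater = [x for x in items if x > pivot]
    return randomized_quicksort(less) + equal + randomized_quicksort(greater)

print(randomized_quicksort([5, 2, 9, 1, 5, 6]))  # [1, 2, 5, 5, 6, 9]
```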

" Retour à l'index des glossaires
fr_FRFR
Retour en haut