You might have an algorithm for getting from the office to home, for writing a chunk of code that calculates the terms of the Fibonacci sequence, or for finding what you're looking for in a retail store. Algorithms are the building blocks of computer programs: sequences of unambiguous instructions (the term 'unambiguous' indicates that there is no room for subjective interpretation) that describe how a problem can be addressed and solved. Like road maps for accomplishing a given, well-defined task, they always have a clear stopping point.
Long division and column addition are examples everyone is familiar with; even a simple function for adding two numbers is an implementation of a particular algorithm. Online grammar checking uses algorithms. Financial computations use algorithms. Robotics uses algorithms to control machines. An encryption algorithm transforms data according to specified actions in order to protect it. A search engine like Google runs search algorithms that take a string of keywords as input, search an associated database for relevant web pages, and return results. In fact, it is difficult to think of a task performed by your computer that does not use algorithms, which are a lot like recipes.
Computer algorithms (step-by-step techniques for problem solving) also play an essential role in space exploration programs. Scientists rely on enormous calculations, managed by high-end supercomputers that follow detailed sets of instructions to arrive at an answer. Algorithms have applications in many different disciplines, from mathematics and physics to, of course, computing, and they give us a well-defined way of accomplishing a task. Here is why algorithms matter in computer programming:
- To improve the effectiveness of a computer program: a well-chosen algorithm (a procedure or formula for solving a problem, based on a sequence of specified actions) can improve the speed at which a program executes and reduce the time it takes to solve a problem.
- Proper usage of resources: the right choice of algorithm helps ensure that a program consumes the least amount of memory. Apart from memory, the algorithm also determines how much processing power the program needs.
The algorithm for a child's morning routine could be the following:
- Step 1: Wake up and turn off alarm
- Step 2: Get dressed
- Step 3: Brush teeth
- Step 4: Eat breakfast
- Step 5: Go to school
The algorithm to add two numbers entered by the user would look something like this:
- Step 1: Start
- Step 2: Declare variables num1, num2 and sum
- Step 3: Read values num1 and num2
- Step 4: Add num1 and num2 and assign the result to sum
- sum ← num1 + num2
- Step 5: Display sum
- Step 6: Stop
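As a minimal Python sketch of the same steps (the variable name total stands in for sum, which is a Python built-in):

```python
# Step 1: Start
# Steps 2-3: declare variables and read values num1 and num2
num1 = float(input("Enter the first number: "))
num2 = float(input("Enter the second number: "))

# Step 4: sum <- num1 + num2 ('total' avoids shadowing Python's built-in sum)
total = num1 + num2

# Step 5: display sum
print("sum =", total)
# Step 6: Stop
```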
Two algorithms can accomplish exactly the same goal yet do it in completely different ways. In computer programming, there are often many different algorithms (well-defined computational procedures that take some value, or set of values, as input and produce some value, or set of values, as output) for accomplishing any given task. Each algorithm has merits and demerits in different situations. If you have a million integer values between -2147483648 and +2147483647 and you need to sort them, a bin sort may be the right algorithm to use. If you have a million book titles, quicksort might be the better choice. By knowing the strengths and weaknesses of different algorithms, you can pick the best one for a specific task or problem.
One of the most important aspects of an algorithm is how fast it can manipulate data in various ways, such as inserting a new item, searching for a particular item, or sorting items. It is often easy to come up with a list of rules to follow in order to solve a problem, but if the algorithm is too slow, it's back to the drawing board. The efficiency of an algorithm depends on its design and its implementation, and since every algorithm uses computer resources to run, execution time and memory usage are the key considerations when analyzing one.
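To make the efficiency point concrete, here is a small, hypothetical timing comparison: both functions find the same value in a sorted list, but the linear scan examines elements one by one while binary search halves the range at each step. The data sizes and names are arbitrary:

```python
import bisect
import random
import timeit

data = sorted(random.sample(range(10_000_000), 1_000_000))
target = data[-1]  # worst case for the linear scan

def linear_search(items, value):
    # Examine every element until the value is found: O(n) comparisons.
    for i, item in enumerate(items):
        if item == value:
            return i
    return -1

def binary_search(items, value):
    # Repeatedly halve the search range on sorted input: O(log n) comparisons.
    i = bisect.bisect_left(items, value)
    return i if i < len(items) and items[i] == value else -1

print("linear:", timeit.timeit(lambda: linear_search(data, target), number=10))
print("binary:", timeit.timeit(lambda: binary_search(data, target), number=10))
```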
Why Study Algorithms?
Algorithms are the heart of computer science, and the subject has countless practical applications as well as intellectual depth. They are used throughout all areas of information technology, from finding an item with specific properties among a collection of items, to transforming data to protect it, to solving a mathematical problem (such as finding the greatest common divisor) in a finite number of steps that often involves repetition of an operation. The word algorithm, a mathematical concept whose roots date back to around 600 AD and the invention of the decimal system, derives from the name of the ninth-century Persian mathematician and geographer Mohammed ibn-Musa al-Khwarizmi, who was part of the royal court in Baghdad and lived from about 780 to 850. Algorithms themselves, widely recognized as the foundation of modern computing, have an even longer and distinguished history stretching back as far as the Babylonians.
Although there is some evidence of early multiplication algorithms in Egypt (around 1700-2000 BC), the oldest known algorithms are widely acknowledged to be those found on a set of Babylonian clay tablets dating to around 1600-1800 BC. Their exact significance was only revealed around 1972, when Donald E. Knuth, the American computer scientist, mathematician, and professor emeritus at Stanford University, published the first English translations of various Babylonian cuneiform mathematical tablets.
Here are some short extracts from his 1972 manuscript that explain these early algorithms:
"The calculations described in Babylonian tablets are not merely the solutions to specific individual problems; they are actually general procedures for solving a whole class of problems." - Pages 672 to 673 of "Ancient Babylonian Algorithms".
The wedge-shaped marks on the clay tablets also seem to have been an early form of instruction manual:
"Note also the stereotyped ending, 'This is the procedure,' which is commonly found at the end of each section on a table. Thus the Babylonian procedures are genuine algorithms, and we can commend the Babylonians for developing a nice way to explain an algorithm by example as the algorithm itself was being defined...." - Pages 672 to 673 of "Ancient Babylonian Algorithms".
The use of computers, however, has raised the use of algorithms in daily transactions (accessing an automated teller machine (ATM), booking a flight or a train, buying something online) to unprecedented levels, and real-world problems whose solutions require advanced algorithms abound. From Google search to morning routines, algorithms are ubiquitous in our everyday lives, and their use is only likely to grow as more tasks are broken down into chunks that can be solved by specific implementations. Many problems, even ones that may not seem realistic, draw on exactly the kind of well-defined algorithmic knowledge that comes up every day in the real world. By developing a good understanding of algorithms, you will be able to choose the right one for a problem and apply it properly. Different algorithms play different roles in programming, and algorithms are used wherever a computer program must (a minimal skeleton follows this list):
- Get input data.
- Process it using the appropriate logic.
- Stop when it finds an answer or some conditions are met.
- Produce the desired output.
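Here is a minimal, hypothetical sketch of that shape in Python; the task (summing numbers until a blank line is entered) is chosen only for illustration:

```python
def run():
    total = 0.0
    while True:
        line = input("Enter a number (blank line to finish): ")  # get input data
        if line == "":            # stop when the condition is met
            break
        total += float(line)      # process it using the program's logic
    print("Total:", total)        # produce the desired output

if __name__ == "__main__":
    run()
```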
To give you a better picture, here are the most common types of algorithms (a toy example of one of them follows the list):
- Searching Algorithms
- Sorting Algorithms
- Path finding Algorithms
- Tree and graph based algorithms
- Approximate Algorithms
- Compression Algorithms
- Random Algorithms
- Pattern Matching
- Sequence Finding and a lot more
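To ground one entry from the list, here is a toy sketch of a compression algorithm: a run-length encoder that collapses repeated characters into (character, count) pairs. Real compression algorithms are far more sophisticated; this is only an illustration:

```python
def run_length_encode(text: str) -> list[tuple[str, int]]:
    # Collapse each run of identical characters into (character, run length).
    encoded = []
    for ch in text:
        if encoded and encoded[-1][0] == ch:
            encoded[-1] = (ch, encoded[-1][1] + 1)
        else:
            encoded.append((ch, 1))
    return encoded

print(run_length_encode("aaabccccd"))  # [('a', 3), ('b', 1), ('c', 4), ('d', 1)]
```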
You only need to define your problem and then select the right algorithm to use. The word algorithm may not seem closely connected to kids, but the truth is that, for kids, understanding the process of building a step-by-step method for solving a problem helps build a strong foundation in logical thinking and problem solving. Here are some problems you can ask your kid to discuss algorithmic solutions with you (simple sketches follow the list):
- How do we know if a number is odd or even?
- How do we calculate all of the factors of a number?
- How can we tell if a number is prime?
- Given a list of ten numbers in random order, how can we put them in order?
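Here are minimal Python sketches of answers to those four questions; the function names and the choice of selection sort are just for illustration:

```python
def is_even(n: int) -> bool:
    # A number is even when dividing by 2 leaves no remainder.
    return n % 2 == 0

def factors(n: int) -> list[int]:
    # Try every candidate from 1 to n and keep those that divide evenly.
    return [d for d in range(1, n + 1) if n % d == 0]

def is_prime(n: int) -> bool:
    # A prime has exactly two factors: 1 and itself.
    return n > 1 and factors(n) == [1, n]

def sort_numbers(numbers: list[int]) -> list[int]:
    # Selection sort: repeatedly move the smallest remaining number to the front.
    items = list(numbers)
    for i in range(len(items)):
        smallest = min(range(i, len(items)), key=items.__getitem__)
        items[i], items[smallest] = items[smallest], items[i]
    return items

print(is_even(7), factors(12), is_prime(13), sort_numbers([5, 2, 9, 1]))
```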
Algorithms have shown they can yield results in all industries, from predicting insurance sales opportunities and serving millions of search queries every day to automating medical research, optimizing transportation routes, and much more. While algorithms help companies like MasterCard and Visa keep their users' information (card numbers, passwords, bank statements) safe, algorithms aren't perfect. They fail, and some fail spectacularly. Over the past few years there have been serious failures of algorithms, the formulas or sets of rules used in digital decision-making processes, and people are now questioning whether we're putting too much trust in them. Widely reported incidents at Facebook, Instagram, and Amazon show that humans are still needed and that context matters.
Timeline of algorithms:
- Antiquity – written "recipes" (on cooking, rituals, agriculture and other themes)
- c. 1700–2000 BC – Egyptians develop earliest known algorithms for multiplying two numbers
- c. 1600 BC – Babylonians develop earliest known algorithms for factorization and finding square roots
- c. 300 BC – Euclid's algorithm
- c. 200 BC – the Sieve of Eratosthenes
- 263 AD – Gaussian elimination described by Liu Hui
- 628 – Chakravala method described by Brahmagupta
- c. 820 – Al-Khawarizmi described algorithms for solving linear equations and quadratic equations in his Algebra; the word algorithm comes from his name
- 825 – Al-Khawarizmi described the algorism, algorithms for using the Hindu-Arabic numeral system, in his treatise On the Calculation with Hindu Numerals, which was translated into Latin as Algoritmi de numero Indorum, where "Algoritmi", the translator's rendition of the author's name gave rise to the word algorithm (Latin algorithmus) with a meaning "calculation method"
- c. 850 – cryptanalysis and frequency analysis algorithms developed by Al-Kindi (Alkindus) in A Manuscript on Deciphering Cryptographic Messages, which contains algorithms on breaking encryptions and ciphers
- c. 1025 – Ibn al-Haytham (Alhazen), was the first mathematician to derive the formula for the sum of the fourth powers, and in turn, he develops an algorithm for determining the general formula for the sum of any integral powers, which was fundamental to the development of integral calculus
- c. 1400 – Ahmad al-Qalqashandi gives a list of ciphers in his Subh al-a'sha which include both substitution and transposition, and for the first time, a cipher with multiple substitutions for each plaintext letter; he also gives an exposition on and worked example of cryptanalysis, including the use of tables of letter frequencies and sets of letters which cannot occur together in one word
- 1540 – Lodovico Ferrari discovered a method to find the roots of a quartic polynomial
- 1545 – Gerolamo Cardano published Cardano's method for finding the roots of a cubic polynomial
- 1614 – John Napier develops method for performing calculations using logarithms
- 1671 – Newton–Raphson method developed by Isaac Newton
- 1690 – Newton–Raphson method independently developed by Joseph Raphson
- 1706 – John Machin develops a quickly converging inverse-tangent series for π and computes π to 100 decimal places
- 1789 – Jurij Vega improves Machin's formula and computes π to 140 decimal places
- 1805 – FFT-like algorithm known by Carl Friedrich Gauss
- 1842 – Ada Lovelace writes the first algorithm for a computing engine
- 1903 – A Fast Fourier Transform algorithm presented by Carle David Tolmé Runge
- 1926 – Borůvka's algorithm
- 1926 – Primary decomposition algorithm presented by Grete Hermann
- 1934 – Delaunay triangulation developed by Boris Delaunay
- 1936 – Turing machine, an abstract machine developed by Alan Turing; together with work by others, it formalized the modern notion of the algorithm
- 1942 – A Fast Fourier Transform algorithm developed by G.C. Danielson and Cornelius Lanczos
- 1945 – Merge sort developed by John von Neumann
- 1947 – Simplex algorithm developed by George Dantzig
- 1952 – Huffman coding developed by David A. Huffman
- 1953 – Simulated annealing introduced by Nicholas Metropolis
- 1954 – Radix sort computer algorithm developed by Harold H. Seward
- 1956 – Kruskal's algorithm developed by Joseph Kruskal
- 1957 – Prim's algorithm developed by Robert Prim
- 1957 – Bellman–Ford algorithm developed by Richard E. Bellman and L. R. Ford, Jr.
- 1959 – Dijkstra's algorithm developed by Edsger Dijkstra
- 1959 – Shell sort developed by Donald L. Shell
- 1959 – De Casteljau's algorithm developed by Paul de Casteljau
- 1959 – QR factorization algorithm developed independently by John G.F. Francis and Vera Kublanovskaya
- 1960 – Karatsuba multiplication
- 1962 – AVL trees
- 1962 – Quicksort developed by C. A. R. Hoare
- 1962 – Ford–Fulkerson algorithm developed by L. R. Ford, Jr. and D. R. Fulkerson
- 1962 – Bresenham's line algorithm developed by Jack E. Bresenham
- 1962 – Gale–Shapley 'stable-marriage' algorithm developed by David Gale and Lloyd Shapley
- 1964 – Heapsort developed by J. W. J. Williams
- 1964 – multigrid methods first proposed by R. P. Fedorenko
- 1965 – Cooley–Tukey algorithm rediscovered by James Cooley and John Tukey
- 1965 – Levenshtein distance developed by Vladimir Levenshtein
- 1965 – Cocke–Younger–Kasami (CYK) algorithm independently developed by Tadao Kasami
- 1965 – Buchberger's algorithm for computing Gröbner bases developed by Bruno Buchberger
- 1966 – Dantzig algorithm for shortest path in a graph with negative edges
- 1967 – Viterbi algorithm proposed by Andrew Viterbi
- 1967 – Cocke–Younger–Kasami (CYK) algorithm independently developed by Daniel H. Younger
- 1968 – A* graph search algorithm described by Peter Hart, Nils Nilsson, and Bertram Raphael
- 1968 – Risch algorithm for indefinite integration developed by Robert Henry Risch
- 1969 – Strassen algorithm for matrix multiplication developed by Volker Strassen
- 1970 – Dinic's algorithm for computing maximum flow in a flow network by Yefim (Chaim) A. Dinitz
- 1970 – Knuth–Bendix completion algorithm developed by Donald Knuth and Peter B. Bendix
- 1970 – BFGS method of the quasi-Newton class
- 1972 – Graham scan developed by Ronald Graham
- 1972 – Red–black trees and B-trees discovered
- 1973 – RSA encryption algorithm discovered by Clifford Cocks
- 1973 – Jarvis march algorithm developed by R. A. Jarvis
- 1973 – Hopcroft–Karp algorithm developed by John Hopcroft and Richard Karp
- 1974 – Pollard's p − 1 algorithm developed by John Pollard
- 1975 – Genetic algorithms popularized by John Holland
- 1975 – Pollard's rho algorithm developed by John Pollard
- 1975 – Aho–Corasick string matching algorithm developed by Alfred V. Aho and Margaret J. Corasick
- 1975 – Cylindrical algebraic decomposition developed by George E. Collins
- 1976 – Salamin–Brent algorithm independently discovered by Eugene Salamin and Richard Brent
- 1976 – Knuth–Morris–Pratt algorithm developed by Donald Knuth and Vaughan Pratt and independently by J. H. Morris
- 1977 – Boyer–Moore string search algorithm, for finding occurrences of a string within another string, developed by Robert S. Boyer and J Strother Moore
- 1977 – RSA encryption algorithm rediscovered by Ron Rivest, Adi Shamir, and Len Adleman
- 1977 – LZ77 algorithm developed by Abraham Lempel and Jacob Ziv
- 1977 – multigrid methods developed independently by Achi Brandt and Wolfgang Hackbusch
- 1978 – LZ78 algorithm developed from LZ77 by Abraham Lempel and Jacob Ziv
- 1978 – Bruun's algorithm proposed for powers of two by Georg Bruun
- 1979 – Khachiyan's ellipsoid method developed by Leonid Khachiyan
- 1979 – ID3 decision tree algorithm developed by Ross Quinlan
- 1980 – Brent's algorithm for cycle detection developed by Richard P. Brent
- 1981 – Quadratic sieve developed by Carl Pomerance
- 1983 – Simulated annealing developed by S. Kirkpatrick, C. D. Gelatt and M. P. Vecchi
- 1983 – Classification and regression tree (CART) algorithm developed by Leo Breiman, et al.
- 1984 – LZW algorithm developed from LZ78 by Terry Welch
- 1984 – Karmarkar's interior-point algorithm developed by Narendra Karmarkar
- 1984 – ACORN_PRNG discovered by Roy Wikramaratna and used privately
- 1985 – Simulated annealing independently developed by V. Cerny
- 1985 – Car–Parrinello molecular dynamics developed by Roberto Car and Michele Parrinello
- 1985 – Splay trees discovered by Sleator and Tarjan
- 1986 – Blum Blum Shub proposed by L. Blum, M. Blum, and M. Shub
- 1986 – Push relabel maximum flow algorithm by Andrew Goldberg and Robert Tarjan
- 1987 – Fast multipole method developed by Leslie Greengard and Vladimir Rokhlin
- 1988 – Special number field sieve developed by John Pollard
- 1989 – ACORN_PRNG published by Roy Wikramaratna
- 1990 – General number field sieve developed from SNFS by Carl Pomerance, Joe Buhler, Hendrik Lenstra, and Leonard Adleman
- 1991 – Wait-free synchronization developed by Maurice Herlihy
- 1992 – Deutsch–Jozsa algorithm proposed by D. Deutsch and Richard Jozsa
- 1992 – C4.5 algorithm, a descendant of ID3 decision tree algorithm, was developed by Ross Quinlan
- 1993 – Apriori algorithm developed by Rakesh Agrawal and Ramakrishnan Srikant
- 1993 – Karger's algorithm to compute the minimum cut of a connected graph by David Karger
- 1994 – Shor's algorithm developed by Peter Shor
- 1994 – Burrows–Wheeler transform developed by Michael Burrows and David Wheeler
- 1994 – Bootstrap aggregating (bagging) developed by Leo Breiman
- 1995 – AdaBoost algorithm, the first practical boosting algorithm, was introduced by Yoav Freund and Robert Schapire
- 1995 – soft-margin support vector machine algorithm published by Vladimir Vapnik and Corinna Cortes; it adds a soft-margin idea to the 1992 algorithm by Boser, Guyon, and Vapnik, and is the algorithm people usually mean when they say SVM
- 1995 – Ukkonen's algorithm for construction of suffix trees
- 1996 – Bruun's algorithm generalized to arbitrary even composite sizes by H. Murakami
- 1996 – Grover's algorithm developed by Lov K. Grover
- 1996 – RIPEMD-160 developed by Hans Dobbertin, Antoon Bosselaers, and Bart Preneel
- 1997 – Mersenne Twister, a pseudorandom number generator, developed by Makoto Matsumoto and Takuji Nishimura
- 1998 – PageRank algorithm was published by Larry Page
- 1998 – rsync algorithm developed by Andrew Tridgell
- 1999 – gradient boosting algorithm developed by Jerome H. Friedman
- 1999 – Yarrow algorithm designed by Bruce Schneier, John Kelsey, and Niels Ferguson
- 2000 – Hyperlink-induced topic search a hyperlink analysis algorithm developed by Jon Kleinberg
- 2001 – Lempel–Ziv–Markov chain algorithm for compression developed by Igor Pavlov
- 2001 – Viola–Jones algorithm for real-time face detection was developed by Paul Viola and Michael Jones.
- 2002 – AKS primality test developed by Manindra Agrawal, Neeraj Kayal and Nitin Saxena
- 2002 – Girvan–Newman algorithm to detect communities in complex systems
Top Algorithms every computer science student should know (a depth-first search sketch follows the list):
- Depth-First Search
- Binary search algorithm
- Sorting
- Breadth-First Search
- Maze Router: Lee Algorithm
- Flood Fill
- Longest Increasing Subsequence
- Heapsort
- Topological Sort on a DAG
- Union-Find Algorithms
- Minimum Spanning Tree
- Single-Source Shortest Paths
- All Pairs Shortest Paths
- Backtracking
- Greedy Algorithms
- Brute Force
- Divide-and-conquer algorithms
- Recursive Algorithms
- Dynamic Programming
- Randomized Algorithms
- Branch and Bound Algorithms
- Supervised Learning
- Reinforcement Learning
- Unsupervised Learning
- Linked Lists
- Graph Algorithms
- Segment Tree
- Matrix algorithms
- Semi-Supervised Learning
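As a sketch of the first entry in the list, here is an iterative depth-first search over a small adjacency-list graph; the graph and vertex names are made up for illustration:

```python
def depth_first_search(graph: dict[str, list[str]], start: str) -> list[str]:
    # Visit vertices by always following the most recently discovered edge first.
    visited, stack, order = set(), [start], []
    while stack:
        vertex = stack.pop()
        if vertex in visited:
            continue
        visited.add(vertex)
        order.append(vertex)
        # Push neighbours in reverse so they are explored in listed order.
        stack.extend(reversed(graph.get(vertex, [])))
    return order

graph = {"A": ["B", "C"], "B": ["D"], "C": ["D"], "D": []}
print(depth_first_search(graph, "A"))  # ['A', 'B', 'D', 'C']
```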
Machine Learning Algorithms (a k-nearest-neighbors sketch follows the list):
- Dimensionality Reduction
- Naive Bayes
- Logistic Regression
- Support Vector Machines
- Gradient Boosting
- Decision Trees
- The K-means Clustering Algorithm
- Linear regression
- kNN
- Random Forests
- ECLAT Algorithm
- APRIORI Algorithm
- Association Rule Mining Algorithms
- Clustering Algorithms
- Instance-Based Learning Algorithms
- Artificial Neural Network (ANN)
- Ensemble Methods
- Regularization Algorithms
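As a sketch of one of these, here is a tiny k-nearest-neighbors (kNN) classifier written from scratch; the points, labels, and parameter k are invented for illustration:

```python
from collections import Counter
import math

def knn_predict(points, labels, query, k=3):
    # Classify 'query' by majority vote among its k closest training points.
    nearest = sorted(range(len(points)), key=lambda i: math.dist(points[i], query))
    votes = Counter(labels[i] for i in nearest[:k])
    return votes.most_common(1)[0][0]

points = [(1, 1), (1, 2), (2, 1), (8, 8), (8, 9), (9, 8)]
labels = ["low", "low", "low", "high", "high", "high"]
print(knn_predict(points, labels, (2, 2)))  # expected: 'low'
```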
Books:
- The Design and Analysis of Parallel Algorithms – Selim G. Akl
- Computational and Algorithmic Linear Algebra and n-Dimensional Geometry – Katta G. Murty
- Algorithms Unlocked – Thomas H. Cormen
- Machine Learning Models and Algorithms for Big Data Classification – Shan Suthaharan
- Algorithms and Data Structures – N. Wirth
- Algorithm Design – Jon Kleinberg
- The Master Algorithm: How the Quest for the Ultimate Learning Machine Will Remake Our World – Pedro Domingos
- Algorithmic Graph Theory – David Joyner
- Algorithmics: Theory and Practice – Gilles Brassard
- Algorithms for Data Science – Brian Steele
- Algorithms For Dummies – Luca Massaron
- Algorithms in C – Robert Sedgewick
- Algorithms in a Nutshell – George T. Heineman
- Algorithms For Interviews – Adnan Aziz
- Algorithmic Mathematics – Leonard Soicher
- An Introduction to Genetic Algorithms for Scientists and Engineers – David Coley
- The Design of Approximation Algorithms – David Shmoys
- Annotated Algorithms in Python: With Applications in Physics, Biology, and Finance – Massimo di Pierro
- Planning Algorithms – Steven M. LaValle
- Clever Algorithms – Jason Brownlee
- Introduction to Algorithms – Thomas H. Cormen
- Combinatorial Algorithms for Computers and Calculators – Albert Nijenhuis
- Algorithms to Live By: The Computer Science of Human Decisions – Tom Griffiths
- Data Mining Algorithms: Explained Using R – Pawel Cichosz
- Data Structures and Algorithms Made Easy – Narasimha Karumanchi
- Data Structures and Algorithms in Java – Robert Lafore
- Data Structures & Problem Solving Using Java – Mark Allen Weiss
- Data Structures and Algorithms with JavaScript – Michael McMillan
- Genetic Algorithms – Prem Junsawang
- Fast Fourier Transform: Algorithms and Applications – K.R. Rao
- Fundamentals of Computer Algorithms – Ellis Horowitz
- Analysis of Algorithms: An Active Learning Approach – Jeffrey J. McConnell
- Graph Algorithms in Bioinformatics
- Algorithms: Greedy Algorithms – Amotz Bar-Noy
- Grokking Algorithms: An Illustrated Guide for Programmers and Other Curious People – Aditya Y. Bhargava
- Handbook of Scheduling: Algorithms, Models, and Performance Analysis – Joseph Y-T. Leung
- Computer Algorithms – Sartaj Sahni
- How to Think About Algorithms – Jeff Edmonds
- Introduction to Algorithms: A Creative Approach – Udi Manber
- Introduction to Evolutionary Algorithms – Mitsuo Gen
- Introduction to Genetic Algorithms – S.N. Sivanandam
- Introduction to the Design and Analysis of Algorithms – Anany Levitin
- Knapsack Problems: Algorithms and Computer Implementations – Silvano Martello
- Machine Learning: An Algorithmic Perspective – Stephen Marsland
- Mastering Algorithms with C – Kyle Loudon
- Memory Management: Algorithms and Implementation in C/C++ – Bill Blunden
- Network Routing: Algorithms, Protocols, and Architectures – Deepankar Medhi
- Neural Networks: Algorithms, Applications, and Programming Techniques – James A. Freeman
- Numerical Algorithms – Justin Solomon
- Algorithms and Parallel Computing – Fayez Gebali
- Pearls of Functional Algorithm Design – Richard S. Bird
- Problems on Algorithms – Ian Parberry
- Practical Genetic Algorithms – Sue Ellen Haupt
- Problem Solving with Algorithms and Data Structures – Brad Miller
- Randomized Algorithms – Rajeev Motwani
- Representations for Genetic and Evolutionary Algorithms – Franz Rothlauf
- The Algorithm Design Manual – Steven S. Skiena
- Sams Teach Yourself Data Structures and Algorithms in 24 Hours – Robert Lafore
- The Design and Analysis of Computer Algorithms – Alfred Aho
- The Art of Computer Programming, Volume 1: Fundamental Algorithms – Donald Knuth
- Understanding Machine Learning: From Theory to Algorithms – Shai Shalev-Shwartz
- Algorithms and Data Structures for External Memory – Jeffrey Scott Vitter
- Algorithms Illuminated, Part 1: The Basics – Tim Roughgarden
- Introduction to the Theory of Computation – Michael Sipser
- The Little Schemer – Matthias Felleisen
- Decision Making in Medicine: An Algorithmic Approach – Stuart B. Mushlin
- Python Algorithms: Mastering Basic Algorithms in the Python Language – Magnus Lie Hetland
- Learning Algorithms Through Programming and Puzzle Solving – Alexander S. Kulikov
- Foundations of Algorithms – Richard E. Neapolitan
- Think Data Structures: Algorithms and Information Retrieval in Java – Allen B. Downey