The study of algorithms is motivated by the human desire for answers to questions. We prefer answers that are true and are arrived at by objective methods.

The reasons for these preferences are practical. Our actions are more likely to have satisfactory results if we understand the world as it is, so we seek truth. It is easier to secure the agreement and cooperation of others towards common goals if they can confirm our answers independently, so in seeking truth we use methods that will give the same results when applied by others.

Sometimes we find that many questions have a similar structure and are instances of the same problem. Then there may be a single method by which one can answer any question of the common form. That is what an algorithm is: an effective, step-by-step, computational method for obtaining an answer to any instance of a general, formally stated problem.

Before computers, enacting an algorithm that required a large number of steps, or one that operated on a large quantity of data, called for extraordinary patience, care, and tenacity. Even so, the results were often flawed. For instance, from 1853 to 1873, a dedicated hobbyist named William Shanks devoted much of his free time to the computation of a high-precision value for π, obtaining 707 digits after the decimal point. This stood as a sort of record for a single computation until 1944, when it was discovered that Shanks had made a mistake that affected the 528th and all subsequent digits. The largest collection of related computations ever performed without mechanical aids was probably the 1880 census of the United States, which took seven or eight years to complete and was almost certainly riddled with incorrect results.

The invention and development of electronic stored-program computers largely eliminated these constraints on the complexity of computations. We can now compute 707 digits of π in a fraction of a second, while the 1880 census computations might take a minute. A long computation today would be something like computing five hundred billion digits of π, and we would expect the result to be completely correct. A large data set today might be measured in terabytes.

Another impediment to progress in the creation and use of algorithms, until the middle of the twentieth century, was the absence of any unambiguous general-purpose notation for recording them. When an algorithm is described in ordinary prose, it is often difficult to figure out exactly how to enact it. Such descriptions often omit details, fail to explain how to deal with exceptional cases, or require the performer to guess how to proceed at key points. For instance, if you learned the pencil-and-paper method for long division (in which the divisor has more than one digit), you may recall being obliged to estimate the next digit of the quotient and having to backtrack and revise if your initial guess was wrong.

The invention and development of high-level programming languages have largely removed this obstacle as well. It is now commonplace for the creator of an algorithm to express it exactly, completely, and unambiguously in the form of a computer program.

However, a human reader, encountering an algorithm for the first time, may have trouble recognizing its underlying structure in the source code of the computer program that enacts it. One of the difficulties for such a reader is that many high-level programming languages embody and enforce a model of computation in which programs work by repeatedly altering the state of an appropriately initialized storage device. We often find it difficult to grasp the interacting and cumulative effects of these alterations.

The solution to this difficulty is to use a different model of computation. In a pure functional programming language, one thinks of a computation as the application of a mathematical function to argument values, yielding result values. If the computation is long and intricate, it is convenient to define the mathematical function in terms of other, simpler functions, which in turn may be defined in terms of others that are simpler still. At each level, however, the functions are stateless, yielding the same results when given the same arguments. Once we have learned the rule by which a function derives its results from its arguments, we can treat the function as a reliable component with fixed and predictable behavior. This modularity makes it easier to design large programs, easier to construct them, and easier to reason about them.
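As a small illustration (a sketch of my own, not an example from the book), the following Scheme definitions build one function out of a simpler one. Both are stateless: given the same arguments, each always yields the same result, so each can be treated as a reliable component.

```scheme
;; square is a simple, stateless function.
(define (square n)
  (* n n))

;; sum-of-squares is defined in terms of the simpler function square.
;; Because square always gives the same result for the same argument,
;; we can reason about sum-of-squares without re-examining square's
;; internals.
(define (sum-of-squares a b)
  (+ (square a) (square b)))

(display (sum-of-squares 3 4)) ; displays 25
(newline)
```

Because neither definition alters any stored state, evaluating `(sum-of-squares 3 4)` a second time is guaranteed to yield 25 again.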

In addition, functional programming languages make it easier to identify and abstract out general patterns of interaction among functions, so that one can describe and operate on those patterns within the language itself. Functions are themselves values that can be transmitted as arguments to other functions and returned as results by such functions. Using higher-order functions also makes it easier to formulate and understand common algorithms.
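For instance (again a sketch of my own, not drawn from the book), in Scheme a function can be passed to another function and a new function can be returned as a result. Here `twice` is a higher-order function that builds a new function from an existing one, and the standard procedure `map` applies that new function to each element of a list:

```scheme
;; twice receives a function f and returns a new function
;; that applies f two times in succession.
(define (twice f)
  (lambda (x) (f (f x))))

;; add3 is an ordinary one-argument function, here bound as a value.
(define add3 (lambda (x) (+ x 3)))

;; map is a higher-order function: it takes a function as its first
;; argument and applies it to every element of the list.
(display (map (twice add3) '(1 2 3))) ; displays (7 8 9)
(newline)
```

The pattern "apply a function two times" is captured once, in `twice`, and can then be reused with any one-argument function whatsoever.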

The plan of this book is to present a variety of widely used algorithms, expressing them in a pure functional programming language so as to make their structure and operation clearer to readers. Other advantages of the functional style will become apparent along the way.

We present the algorithms in a purely functional version of the Scheme programming language, implemented through the (afp primitives) library described in Appendix B. The code, which has been thoroughly tested under the Chibi-Scheme, Larceny, and Racket implementations of the Revised⁷ Report on the Algorithmic Language Scheme, is available at the author's Web site.

# Algorithms for Functional Programming


Author: John David Stone
ISBN: 9783662579688
Publisher: Springer
Date: 2018
Pages: 395

