# SICP. Building Abstractions with Procedures

## 1. The Elements of Programming

Every powerful language has three mechanisms for combining simple ideas to form more complex ideas:

- **primitive expressions**, which represent the simplest entities the language is concerned with;
- **means of combination**, by which compound elements are built from simple ones;
- **means of abstraction**, by which compound elements can be named and manipulated as units.

In programming, we deal with two kinds of elements: procedures and data.

### 1.1. Expressions

Expressions within parentheses are called *combinations*. The leftmost element
is called the *operator*, the rest *operands*. Expressions may be *nested*.
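
For example, a nested combination (a minimal sketch in Scheme):

```scheme
; * is the operator; 3 and the nested combination (+ 5 4) are operands.
(* 3 (+ 5 4))   ; => 27
```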

### 1.2. Naming and the Environment

A name identifies a *variable* whose *value* is the object. The memory keeping
track of all assigned variables is called the *(global) environment*.
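
A small sketch of naming with `define`:

```scheme
(define size 2)   ; associates the name size with the value 2
(* 5 size)        ; the environment supplies 2 for size => 10
```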

### 1.3. Evaluating Combinations

To evaluate a combination:

1. Evaluate the subexpressions of the combination;
2. Apply the procedure that is the value of the operator to the arguments that are the values of the operands.

This is *recursive* in nature. Percolating values upward is called *tree
accumulation*.
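
As a sketch, evaluating a combination whose operands are themselves combinations:

```scheme
; Evaluating this combination first evaluates the subexpressions
; (+ 2 4) => 6 and (- 7 1) => 6, then applies * to the results.
(* (+ 2 4) (- 7 1))   ; => 36
```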

For primitive cases:

- the values of numerals are the numbers that they name;
- the values of built-in operators are the machine instruction sequences that carry out the corresponding operations;
- the values of other names are the objects associated with those names in the environment.

Exceptions to the general evaluation rule, such as `define`, are called *special forms*.

### 1.4. Compound Procedures

*Procedure definitions* allow for compound operations to be given a name and
then referred to as a unit.
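
For example, a compound procedure for squaring:

```scheme
(define (square x) (* x x))
(square 5)   ; => 25
```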

### 1.5. The Substitution Model for Procedure Application

To apply a compound procedure to arguments, evaluate the body of the procedure with each formal parameter replaced by the corresponding argument.

There are two evaluation models possible, *applicative order* (evaluate the
arguments and then apply) and *normal order* (fully expand and then reduce).
Lisp uses applicative-order evaluation, partly because of the additional
efficiency obtained from avoiding multiple evaluations of expressions.
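
As a sketch (using the standard `square` example), the two orders expand differently:

```scheme
(define (square x) (* x x))
; Applicative order: evaluate the arguments first, then substitute.
;   (square (+ 2 3))
;   (square 5)
;   (* 5 5)
;   25
; Normal order: fully expand first, then reduce.
;   (square (+ 2 3))
;   (* (+ 2 3) (+ 2 3))   ; (+ 2 3) is evaluated twice
;   (* 5 5)
;   25
```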

### 1.6. Conditional Expressions and Predicates

A *case analysis* consists of *clauses* with *predicates* and *consequent
expressions*.
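
A sketch of a case analysis using `cond`, where each clause pairs a predicate with a consequent expression:

```scheme
(define (abs x)
  (cond ((> x 0) x)
        ((= x 0) 0)
        ((< x 0) (- x))))
```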

**Exercise 1.3**

Define a procedure that takes three numbers as arguments and returns the sum of the squares of the two larger numbers.

```scheme
(define (sum-squares a b)
  (+ (* a a) (* b b)))

(define (f a b c)
  (cond ((and (<= a b) (<= a c)) (sum-squares b c))
        ((and (<= b a) (<= b c)) (sum-squares a c))
        (else (sum-squares a b))))
```

**Exercise 1.5**

```scheme
(define (p) (p))

(define (test x y)
  (if (= x 0) 0 y))

(test 0 (p))
```

What happens when this expression is evaluated under applicative order? Under normal order?

With applicative order it runs forever: the argument `(p)` must be evaluated before `test` can be applied, and `(p)` reduces to itself endlessly. With normal order `(p)` is never evaluated, because `(= x 0)` is true and only the consequent is reduced, so evaluation immediately terminates with 0.

### 1.7. Example: Square Roots by Newton’s Method

```scheme
(define (square x) (* x x))   ; helper used by good-enough?

(define (sqrt-iter guess x)
  (if (good-enough? guess x)
      guess
      (sqrt-iter (improve guess x) x)))

(define (improve guess x)
  (average guess (/ x guess)))

(define (average x y)
  (/ (+ x y) 2))

(define (good-enough? guess x)
  (< (abs (- (square guess) x)) 0.001))

(define (sqrt x)
  (sqrt-iter 1.0 x))
```

Iteration can be accomplished using no special construct other than the ordinary ability to call a procedure.

**Exercise 1.6**

```scheme
(define (new-if predicate then-clause else-clause)
  (cond (predicate then-clause)
        (else else-clause)))

(define (sqrt-iter guess x)
  (new-if (good-enough? guess x)
          guess
          (sqrt-iter (improve guess x) x)))
```

What happens?

The built-in `if` is a special form: even under applicative-order evaluation, the interpreter evaluates only one of its two branches. `new-if` is an ordinary procedure without this property, so all of its arguments are evaluated before it is applied; the recursive call to `sqrt-iter` in its third argument therefore never stops expanding.

**Exercise 1.8**

Implement Newton’s method for cube roots, using the fact that if \(y\) is an approximation to the cube root of \(x\), a better approximation is \(\dfrac{x/y^2 + 2y}{3}\).

```scheme
(define (cube x) (* x (* x x)))

(define (cubert-iter guess x)
  (if (good-enough? guess x)
      guess
      (cubert-iter (improve guess x) x)))

(define (improve guess x)
  (/ (+ (/ x (* guess guess)) (* 2 guess)) 3))

(define (good-enough? guess x)
  (< (abs (- (cube guess) x)) 0.001))

(define (cubert x)
  (cubert-iter 1.0 x))
```

### 1.8. Procedures as Black-Box Abstractions

A variable is *bound* when a procedure definition introduces it as a formal parameter; a variable that is not bound is *free*. The set of expressions for which a binding defines a name is called the *scope* of that name.
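
For instance, in the `good-enough?` procedure from the square-root example:

```scheme
; guess and x are bound (they are formal parameters);
; <, -, abs, and square are free.
(define (good-enough? guess x)
  (< (abs (- (square guess) x)) 0.001))
```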

We can use internal definitions to create a *block structure* and clean up the `sqrt` code.

```scheme
(define (sqrt x)
  (define (good-enough? guess x)
    (< (abs (- (square guess) x)) 0.001))
  (define (improve guess x)
    (average guess (/ x guess)))
  (define (sqrt-iter guess x)
    (if (good-enough? guess x)
        guess
        (sqrt-iter (improve guess x) x)))
  (sqrt-iter 1.0 x))
```

The code can be cleaned up further with *lexical scoping*: since `x` is bound in the definition of `sqrt`, the internal procedures are in its scope and can refer to it directly instead of passing it around.

```scheme
(define (sqrt x)
  (define (good-enough? guess)
    (< (abs (- (square guess) x)) 0.001))
  (define (improve guess)
    (average guess (/ x guess)))
  (define (sqrt-iter guess)
    (if (good-enough? guess)
        guess
        (sqrt-iter (improve guess))))
  (sqrt-iter 1.0))
```

## 2. Procedures and the Processes They Generate

We will examine some common ‘shapes’ of processes and their time and space consumption.

### 2.1. Linear Recursion and Iteration

Looking at procedures to calculate factorials, two different algorithms produce two different shapes.

```scheme
(define (factorial n)
  (if (= n 1)
      1
      (* n (factorial (- n 1)))))
```

```scheme
(define (factorial n)
  (define (iter product counter)
    (if (> counter n)
        product
        (iter (* counter product) (+ counter 1))))
  (iter 1 1))
```

The first process grows in memory; the second runs in constant space. The expansion of the first occurs as the process builds up a chain of *deferred operations*. A process characterized by such a chain of deferred operations is called a *recursive process*. When the amount of information that must be kept to track the deferred operations grows linearly with *n*, the process is called a *linear recursive process*.

The second process does not grow and shrink; it is called an *iterative
process*. An *iterative process*’s state can be summarized by a fixed number
of *state variables*. In computing \(n!\), the number of steps required grows
linearly with \(n\). Such a process is called a *linear iterative process*.
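
As a sketch, the two shapes for `(factorial 4)` look like this:

```scheme
; Linear recursive process: deferred multiplications build up.
;   (factorial 4)
;   (* 4 (factorial 3))
;   (* 4 (* 3 (factorial 2)))
;   (* 4 (* 3 (* 2 (factorial 1))))
;   (* 4 (* 3 (* 2 1)))
;   24
; Linear iterative process: the state is fully captured by the
; two state variables product and counter.
;   (iter 1 1) -> (iter 1 2) -> (iter 2 3) -> (iter 6 4) -> (iter 24 5) -> 24
```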

Note that a recursive *procedure* is simply one that refers to itself (a syntactic description), whereas a recursive *process* is one whose evolution follows a recursive pattern (a description of how it runs).

A language which executes an iterative process in constant space, even if the
iterative process is described by a recursive procedure, is called
*tail-recursive*.