Solutions Manual: An Introduction to Formal Languages and Automata, by Peter Linz

An Introduction to Formal Languages and Automata, by Peter Linz. The study of the theory of computation has several purposes, most importantly to familiarize students with its foundations. The Sixth Edition provides an accessible treatment of all material essential to an introductory theory of computation course. The author, Peter Linz, offers a straightforward, uncomplicated presentation; Appendix B covers JFLAP, a useful tool, and a closing section gives answers, solutions, and hints for selected exercises.


Peter Linz, University of California, Davis, introduces the reader to the important models of finite automata, grammars, and Turing machines. The book also introduces finite automata in an informal way.



Any student with a basic understanding should be able to handle them. They are not always very interesting, but they test the student's grasp of the material, uncover possible misunderstandings, and give everyone the satisfaction of being able to do something. A second type of exercise in the manual I call fill-in-the-details.

These are usually omitted parts of proofs or examples whose broad outlines are sketched in the text. Most of them are not overly difficult, since all the non-obvious points have been spelled out. For mathematically well-trained students these exercises tend to be simple, but for those not in this category they can be more demanding. They are useful primarily in sharpening mathematical reasoning and formalizing skills. The prevalent and most satisfying type of exercise involves both an understanding of the material and an ability to carry it a step further.

These exercises are a little like puzzles whose solution involves inventiveness, ranging from the fairly easy to the very challenging. Some of the more difficult ones require tricks that are not easy to discover, so an occasional hint may be in order. I have identified some of the harder problems with a star, but this classification is highly subjective and may not be shared by others. Although these are not easy problems, I suggest you assign at least one or two of them.

Examining the given solution of Exercise 18 should help. You may want to assign this problem just so that students will read the given answer. Extend the construction in Exercise 14, Section 2.

The idea is the same here, but there are a few more details. Suppose the graph for L has the form sketched in the text. The idea is similar to that of the earlier exercise: we replace each part of the graph of the corresponding form. Note that this works only if we start with a dfa! Simple view of closure via grammars: the ideas are intuitively easy to see, but the arguments should be done carefully.

But as demonstrated with several examples in previous sections, set operations together with known algorithms are often a quicker way to the solution. Construct a dfa. Construct the regular language L1L2. This is a little harder than some of the above exercises.
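The "set operations together with known algorithms" route can be made concrete. Below is a minimal sketch, in my own encoding (the dictionary layout and function names are illustrative, not the book's), of the standard product construction, which yields a dfa for the intersection of two regular languages:

```python
def product_dfa(d1, d2):
    """Product construction: a dfa accepting L(d1) intersected with L(d2).

    Each dfa is a dict: {"start": s, "accept": set, "delta": {(state, sym): state}}.
    """
    start = (d1["start"], d2["start"])
    delta = {}
    accept = set()
    todo = [start]
    seen = {start}
    while todo:
        p, q = todo.pop()
        if p in d1["accept"] and q in d2["accept"]:
            accept.add((p, q))
        # the input alphabet, recovered from the transition table
        for sym in {s for (_, s) in d1["delta"]}:
            nxt = (d1["delta"][(p, sym)], d2["delta"][(q, sym)])
            delta[((p, q), sym)] = nxt
            if nxt not in seen:
                seen.add(nxt)
                todo.append(nxt)
    return {"start": start, "accept": accept, "delta": delta}

def run(dfa, w):
    """Run the dfa on string w and report acceptance."""
    q = dfa["start"]
    for ch in w:
        q = dfa["delta"][(q, ch)]
    return q in dfa["accept"]
```

For closure under other set operations only the acceptance condition changes: union accepts when either component accepts, and difference when the first accepts but the second does not.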

Take a dfa for L. By Exercise 22, Section 4. Similar to an earlier exercise. A good exercise that involves a simple, but not obvious, trick. Some students will come to grief trying to argue from transition graphs. Look at the transition graph for the dfa. If not, check the lengths of all possible paths. The proofs of these are nearly identical to the proof of Theorem 4.

These are theoretical, but worthwhile exercises; they make the student think about what is involved in the proof of the pumping lemma. The results are useful in Exercise 20 below. A simple problem: this set is generally easy and involves little more than a routine application of the pumping lemma, similar to Examples 4.
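For a small language, the routine pumping-lemma argument can even be checked exhaustively. The sketch below is my own illustration, with L = { a^n b^n : n >= 0 } as a stand-in target: it verifies that for w = a^m b^m every decomposition w = xyz with |xy| <= m and |y| >= 1 has some pumped string x y^i z outside L, which is exactly the contradiction the lemma delivers.

```python
def in_L(w):
    """Membership in L = { a^n b^n : n >= 0 } (illustrative target language)."""
    n = len(w) // 2
    return w == "a" * n + "b" * n

def pumping_fails(w, m):
    """Check that every legal decomposition w = xyz (|xy| <= m, |y| >= 1)
    has some pumped string x y^i z outside L."""
    for j in range(m + 1):             # |x| = j
        for k in range(1, m - j + 1):  # |y| = k >= 1
            x, y, z = w[:j], w[j:j + k], w[j + k:]
            # if every pumping count i keeps us in L, this decomposition survives
            if all(in_L(x + y * i + z) for i in range(3)):
                return False
    return True
```

Here pumping with i = 0 already removes some a's but no b's, so every decomposition fails, confirming the textbook argument.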

A simpler solution is by contradiction: if L is assumed regular, then so is its complement. But we already know that, in this particular case, the complement is not regular. I think that all parts of this exercise should be done before proceeding to harder problems. This set is harder than Exercise 4, since some algebra is required. Part (b) follows from (a), since the language here is the complement of that in (a). Parts (c) and (d) require similar algebraic manipulations.

The language in (a) is regular, but (b) is not. Applying the pumping lemma to (b) requires some care. An easy application of the pumping lemma. A hard problem; part (a) is very hard. The argument through closure is easy since we have Example 4. Very easy: we can easily pump a string out of this language. A good exercise for students to develop some intuition about what is and what is not a regular language.

The key is how much has to be remembered by the automaton. No, but it is hard to apply the pumping lemma directly. The latter is clearly not regular. Repeat the argument in Example 4. The solution is given.

It seems impossible to pick a string to which the theorem can be applied directly. We can do better by applying the extended version of the pumping lemma as stated in Exercises 1 and 2 previously. The result can be pumped out of the language, although it takes a fair amount of arguing to convince yourself that this is so. The proof can be done by induction on the number of vertices. The answer is no. Consequently, z is in the union of all the L_i. Conversely, take any string z of length m that is in all of the L_i.

Very similar in appearance to Exercise 12 in Section 4. Rectangles are described by u^n r^m d^n l^m. Apply the pumping lemma to show that the language is not regular.

A problem for students who misunderstand what the pumping lemma is all about.

Chapter 5: Context-Free Languages

Most of the exercises explore the concepts in a fairly direct way. Many of the exercises are reminiscent of, or extensions to, the grammar problems in Section 1. Straightforward reasoning; simple drill exercise. No, by an easy application of the pumping lemma for regular languages.

Fill-in-the-details via an induction on the number of steps in the derivation. Several of the exercises simplify when split into subsets. Again, they are much easier if split intelligently. The other parts are similar in nature. Conceptually not hard, but the answer is long. An easy exercise, involving two familiar ideas.

An answer: introduce new variables and rewrite rules so that all terminals are on the left side of the productions. This anticipates some of the grammar manipulations of Chapter 6 and is easier after that material has been covered. This exercise anticipates the closure results of Chapter 8. Introduce S1, where S1 derives L. Another simple exercise, anticipating closure under union. A little easier than solving the previous exercise; two alternatives have to be considered. Routine drill to give the derivation tree, but an intuitive characterization of the underlying language is quite hard to make.

An easily solved problem. Easy, but there are many answers. This exercise is not hard, but it is worthwhile since it introduces an important idea. You can view this as an example of a metalanguage, that is, a language about languages.

Same idea as in the previous exercise. For a leftmost derivation, traverse the tree in preorder, that is, process the root, process the left subtree, then process the right subtree, recursively, expanding each variable in turn.
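The preorder idea can be sketched in a few lines. Here a derivation-tree node is a pair (label, children) — my own encoding, not the book's — and the traversal reports the variables in the order a leftmost derivation would expand them:

```python
def leftmost_expansions(tree, variables):
    """Preorder traversal of a derivation tree, yielding (variable, right side)
    pairs in the order a leftmost derivation expands them.

    A tree node is (label, children); leaves have children == [].
    """
    label, children = tree
    order = []
    if label in variables and children:
        # record this expansion, then recurse left to right
        order.append((label, [child[0] for child in children]))
        for child in children:
            order.extend(leftmost_expansions(child, variables))
    return order
```

Processing the children right to left instead yields the order of a rightmost derivation, which is why a derivation tree determines both.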

The fact that we can do this shows that a leftmost derivation is always possible. Expands on discussion in Example 5. A fairly simple exercise involving some counting of leaves in a tree.

The major purpose is to get students to think about derivation trees. To get the answer, establish a relation between the height of the derivation tree and the maximum number of leaves.

For a tree of height h, the maximum number is k^h, where k bounds the number of children of a node. The other extreme is when there is only one node per level. Clearly, there is no choice in a leftmost derivation. A simple, solved problem. Construct a dfa and from it a regular grammar.

Since the dfa never has a choice, there will never be a choice in the productions. In fact, the resulting grammar is an s-grammar. Follow the approach used for arithmetic expressions.
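The dfa-to-regular-grammar construction, and the observation that determinism leaves no choice among the productions, can be sketched as follows (the encoding and names are mine, for illustration):

```python
def dfa_to_grammar(delta, accept):
    """Right-linear grammar from a dfa: a production q -> a p for each
    transition delta(q, a) = p, and q -> epsilon for each accepting state q.
    Since the dfa is deterministic, each variable has at most one production
    per terminal symbol."""
    prods = []
    for (q, a), p in delta.items():
        prods.append((q, (a, p)))   # variable q -> terminal a, variable p
    for q in accept:
        prods.append((q, ()))       # variable q -> epsilon
    return prods

def derives(prods, q, w):
    """Does variable q derive the string w in the constructed grammar?"""
    if w == "":
        return (q, ()) in prods
    return any(rhs and rhs[0] == w[0] and derives(prods, rhs[1], w[1:])
               for (lhs, rhs) in prods if lhs == q)
```

Because each variable offers at most one production per terminal, a derivation, like the dfa's computation, never faces a choice.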

But, by comparing the sentential form with the string to be parsed, we see that there is never a choice in what production to apply, so the grammar is unambiguous. Simple variations on the current theme. So every two productions produce two terminals, and we have at most |w| rounds. Consider leftmost productions. Since the variable to be expanded occurs on the left side of only one production, there is never a choice. The main purpose of this type of exercise is to remind students of potential applications.

Assign one or two of these if you feel that such a reminder is necessary. The exercises are useful to reinforce the constructions. Without them, the students may be left with some misunderstandings. A reminder of derivation trees. Simple, just substitute for B. A routine application of the algorithm in Theorem 6. The grammar generates the empty set. Straightforward, but shows that order matters.

Routine applications of the given theorems. Involve elementary arguments to complete some missing detail. This leads into a subsequent exercise, where the particular observation is made general. This generalizes the previous two exercises. To prove it, show that every nonempty string in L(G) can still be derived. Here we add no productions at all. An argument, which is at least plausible, can be made from the derivation tree.

Since the tree does not embody any order, the order of replacement of variables cannot matter. An important point: B will be recognized as useless, but not A. Exercises 21 and 22 are intimidating in appearance, but really quite simple, once the notation is understood.
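The two-pass removal of useless variables behind remarks like the one above can be sketched directly. The encoding is my own (productions are (head, body) pairs with uppercase variables), and the example grammar in the usage below is illustrative, not the book's:

```python
def remove_useless(productions, start):
    """Remove useless variables in the usual order: first keep variables
    that derive some terminal string, then keep those reachable from the
    start symbol."""
    def is_var(s):
        return s.isupper()

    # Pass 1: variables that derive a terminal string.
    generating = set()
    changed = True
    while changed:
        changed = False
        for head, body in productions:
            if head not in generating and all(
                    (not is_var(s)) or s in generating for s in body):
                generating.add(head)
                changed = True
    prods = [(h, b) for h, b in productions
             if h in generating
             and all((not is_var(s)) or s in generating for s in b)]

    # Pass 2: variables reachable from the start symbol.
    reachable = {start}
    changed = True
    while changed:
        changed = False
        for head, body in prods:
            if head in reachable:
                for s in body:
                    if is_var(s) and s not in reachable:
                        reachable.add(s)
                        changed = True
    return [(h, b) for h, b in prods if h in reachable]
```

In a grammar such as S -> aS | A, A -> a, B -> bA, the variable B is generating but unreachable, so the second pass removes it while A survives — the situation described above.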

I include them because students sometimes ask about this. Looks harder than it is. Obviously, in removing useless variables we introduce no new productions; therefore, the complexity decreases. Serves to point out that just removing useless variables does not simplify the grammar as far as possible. The proof can be done by showing that crucial sentential forms generated by one grammar can also be generated by the other one.

This is actually an important result, dealing with the removal of certain left-recursive productions from the grammar; it is needed if one were to discuss the general algorithm for converting a grammar into Greibach normal form. A routine application illustrating the result of the previous exercise. The arguments are similar to those in that exercise. Straightforward, even though a bit theoretical.

By now the students should have no trouble with this sort of thing. Routine drills, involving an easy application of Theorem 6. To apply the method described in Theorem 6. The rest is easy, but a little lengthy. An exercise that looks harder than it is.

It can be solved with elementary arguments. Enumerate the productions generated in each step. The stated result follows easily. A trivial exercise, just to get students to work with the widely used concept of a dependency graph.

A simple solved exercise leading to a normal form for linear grammars. Another normal form, which can be obtained from Chomsky normal form. They serve to make it plausible that conversion to Greibach normal form is always possible. No, since the result is a regular grammar.

This exercise shows an important extension of the material in the text. Start from Greibach normal form, then make substitutions to reduce the number of variables in the productions. Complete arguments can be found in some advanced treatments, for example, in Harrison. This exercise may be suitable for an extra-credit assignment for the better students.

The exercises in this section can be used to complement what might be a very short discussion in class. Exercises 1 and 2 are drill, requiring no more than an understanding of the notation by which the algorithm is described. Exercises 3 and 4 involve an extension and actual implementation.
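For the implementation exercises, a compact version of the CYK membership algorithm may serve as a reference point. This is a sketch under the assumption that the grammar is in Chomsky normal form; the encoding is mine, not the book's:

```python
def cyk(w, productions, start):
    """CYK membership test for a grammar in Chomsky normal form.

    productions: list of (head, body), body either (terminal,) or (B, C).
    """
    n = len(w)
    if n == 0:
        return False  # this CNF sketch does not handle the empty string
    # table[i][j] = set of variables deriving w[i : i + j + 1]
    table = [[set() for _ in range(n)] for _ in range(n)]
    for i, ch in enumerate(w):
        for head, body in productions:
            if body == (ch,):
                table[i][0].add(head)
    for length in range(2, n + 1):            # substring length
        for i in range(n - length + 1):        # start position
            for split in range(1, length):     # where to cut the substring
                for head, body in productions:
                    if (len(body) == 2
                            and body[0] in table[i][split - 1]
                            and body[1] in table[i + split][length - split - 1]):
                        table[i][length - 1].add(head)
    return start in table[0][n - 1]
```

The table is filled in time cubic in |w|, which is the point students should take away from an actual implementation.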

For this, the student will have to have a good understanding of the construction.

Chapter 7: Pushdown Automata

Particularly instructive are the ones that force the student to think nondeterministically. But once they discover the solution, they begin to understand something about nondeterminism. An even more convincing problem is Exercise 10 in Section 8. No need for state q1; substitute q0 wherever q1 occurs. This problem is not too hard, but requires a little bit of analysis of the nondeterministic nature of the pda in Example 7.

For this reason, it is a helpful exercise. An argument goes somewhat like this: if the transition is not made in the middle of the string, the emergence of the stack start symbol will not coincide with the end of the input. The only way the stack can be cleared is to make the transition in the middle of the input string.

One of these problems may be useful in illustrating this point. A set of exercises in programming a pda. Most students will have some experience in programming with stacks, so this is not too far removed from their experience and consequently tends to be easy.

Those problems, such as (f) and (j), that illustrate nondeterminism may be a little harder. Part (a) is easy; parts (b) and (c) are also easy. Part (d) is a little harder: put a token on the stack for each a. Each b will consume one until the stack start symbol appears. At that time, switch state, so now each b puts on tokens to be consumed by the c's.
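The token-counting description above can be simulated directly with a list as the stack. This is my own reading of the construction — the language shown is { a^n b^(n+m) c^m : n, m >= 0 } — so treat it as an illustration rather than the book's exact machine:

```python
def accepts(w):
    """Deterministic stack simulation of the token-counting idea:
    a's push tokens, b's pop them until the stack bottom appears,
    after which b's push tokens that the c's consume.
    Recognizes { a^n b^(n+m) c^m : n, m >= 0 }."""
    stack = []
    phase = 0          # 0: reading a's, 1: reading b's, 2: reading c's
    pushing_b = False  # True once the b's have exhausted the a-tokens
    for ch in w:
        if ch == 'a':
            if phase != 0:
                return False
            stack.append('X')
        elif ch == 'b':
            if phase == 2:
                return False
            phase = 1
            if stack and not pushing_b:
                stack.pop()            # consume one a-token
            else:
                pushing_b = True
                stack.append('Y')      # start counting for the c's
        elif ch == 'c':
            phase = 2
            if stack and stack[-1] == 'Y':
                stack.pop()            # consume one b-token
            else:
                return False
        else:
            return False
    return not stack                   # accept with empty stack
```

The phase variable plays the role of the state switch described above, and acceptance by empty stack checks that all tokens were matched.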

In (f), nondeterminism is the key; an a can put one, two, or three tokens on the stack. Parts (g), (h), and (i) are easy, following the approach suggested in Example 7. Part (j) is similar, but uses nondeterminism.

Students who blindly follow the lead suggested by Example 7 will run into difficulties; taking care of this takes a good bit of thought. A solution is below. We can use nondeterminism to decide where this part ends. The reason for the lengthiness is that internal states must be used to recognize substrings such as ab as a unit. This substring puts a single symbol on the stack, which is then consumed by the substring ba. These are simple exercises that require an analysis of a given pda. A simple tracing exercise.

Not an easy problem, but the solution is provided. No change. This result is needed in subsequent discussions. In many books this is treated as a major issue. We left it out of our main discussion, but the result is useful.

The construction is not hard to discover. This result is also needed later. A few exercises will make the results plausible, which is all one really needs. Straightforward drill exercise. A relatively easy proof that just formalizes some obvious observations. The long way is to follow the construction in Theorem 7. The grammars should be converted to Greibach normal form, after which the exercises are direct applications of the construction of Theorem 7.

This exercise requires that the student be able to put together two observations made in the text. In Theorem 7. From Theorem 7. Putting the two together proves the claim. This may be quite hard to see and involves a careful examination of the proof of Theorem 7. The state q1 serves the sole purpose of ensuring that the desired derivation gets started. If we replace q1 by q0 everywhere, we change very little, except that Equation 7.

Two similar exercises, one of which is solved. These two exercises illustrate Theorem 7. Since the proof of the theorem is tedious and intricate, you may want to skip it altogether. In that case, these two exercises can be used to make the result more believable. May be worthwhile, since it expands on a comment made in the proof of Theorem 7. Routine use of a given construction. Two previous exercises, Exercises 16 and 17 in Section 7, are relevant here.

This is a worthwhile exercise in algorithm construction. The basic process is sketched in the text. Yes, there are still useless variables. If all useless productions are removed, we get a grammar with only six productions. A little harder, with a given solution.

This is fairly easy to see. When the stack is empty, start the whole process again. Somewhat vague, but worthwhile in anticipation of results in Section 8.

But we have no way of deciding this deterministically. The answers are also hard to grade. The language is deterministic. Special care has to be taken to ensure that ab and aabb are handled correctly. A straightforward exercise. The c makes it obvious where the middle of the string is. This makes a good contrast to Exercise 9.

Since there is no marker, we need to guess where the middle is. When z comes to the top of the stack and we are not in state qf, accept the string. Here the student is asked to work out some details omitted in the text. The arguments should not be too hard.

A dfa can be considered a dpda whose stack is irrelevant. This intuitive reasoning is easy, but if you ask for more detail, the argument tends to become lengthy. These exercises anticipate later results, and both have some nonobvious constructions that can be made along the lines of the intersection construction in Theorem 4.

If Q and P are the state sets for the dfa accepting L2 and the dpda accepting L1, respectively, the control unit has states Q × P. The stack is handled as for L1. The key is that L2 is regular, so that the combined machine needs only a single stack and is therefore a pda. The exercises here point the way to more complicated issues treated in compilers courses.

A reasonably simple argument to show that the grammar is LL(3). An exploration of some connections that were not explicitly mentioned in the text. Formal arguments are involved, making this an extra-credit problem for the very good students. To see if it is LL(2), substitute for the leftmost variable.

Chapter 8: Properties of Context-Free Languages

What is new here is that there are generally more decomposition choices, all of which must be addressed if the argument is to be complete.

The most common mistake is that students forget to take everything into account. The argument is essentially the same as for regular languages. See the solution of Exercise 5(a), Section 4. See solution. This can be pumped out of the language without too much trouble. Follow the suggestions in Examples 8. These will be challenging to many students, but the methods of attack should already be known from previous examples and exercises.

Things get easier after you solve one or two of these. This set of examples is a little more interesting than the previous ones, since we are not told to which language family the examples belong. It is important that the students develop some intuitive feeling for this. Before a correct argument can be made, a good conjecture is necessary.

The string must be long enough so that some variable repeats in the leftmost position. This requires |V| steps. If the length of the right side of any production is not more than N, then in any string of length |V|N or longer, such a repetition must occur. Show that it is context-free by giving a context-free grammar.

Then apply the pumping lemma for linear languages to show that it is not linear. Choose as the string to be pumped a^m b^{2m} a^m. A two-part problem, illustrating existence and non-existence arguments.
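The repetition bound discussed earlier (a repeated variable in the leftmost position once the string is long enough) can be written out as a short counting argument. This is a sketch consistent with the statement above, not a reproduction of the text's proof:

```latex
% Each derivation step replaces one variable by a right side of length at
% most N, so a sentential form after k steps has length at most
% 1 + k(N - 1) \le kN.  Deriving a string w therefore takes at least
% |w| / N steps.  Hence
\[
  |w| \;\ge\; |V|\,N
  \quad\Longrightarrow\quad
  k \;\ge\; \frac{|w|}{N} \;\ge\; |V| ,
\]
% and among the first |V| + 1 variables expanded in the leftmost position,
% two must coincide by the pigeonhole principle, giving the repetition.
```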