Handbook of Control Systems Engineering

This book is a revision and extension of my 1995 Sourcebook of Control Systems Engineering. Because of the extensions and other modifications, it has been retitled Handbook of Control Systems Engineering, which it is intended to be for its prime audience: advanced undergraduate students, beginning graduate students, and practising engineers needing an understandable review of the field or of recent developments that may prove useful. There are several differences between this edition and the first.

  • Two new chapters on aspects of nonlinear systems have been incorporated. The first concentrates on four topics: showing the value of certain linear controllers, arguing the suitability of algebraic linearization, reviewing the semi-classical methods of harmonic balance, and introducing the nonlinear change-of-variable technique known as feedback linearization. The second introduces variable structure control, often with sliding mode.
  • Another new chapter introduces discrete event systems, including several approaches to their analysis.
  • The chapters on robust control and intelligent control have been extensively revised.
  • Modest revisions and extensions have also been made to other chapters, often to incorporate extensions to nonlinear systems.

Product details

  • Hardback | 1063 pages
  • 182.4 x 225 x 50.8mm | 1,315.43g
  • Dordrecht, Netherlands
  • English
  • Revised
  • 2nd ed. 2001
  • XXVI, 1063 p. In 2 volumes, not available separately.
  • 0792374940
  • 9780792374947

Table of contents

Preface.
1. Introduction and overview.
2. Elements of systems engineering of digital control.
3. Sensors and instrumentation.
4. Control elements, actuators, and displays.
5. Computer systems hardware.
6. Computer software.
7. Communications.
8. Control laws without theory.
9. Sources of system models.
10. Continuous-time system representations.
11. Sampled-data system representations.
12. Conversions of continuous-time to discrete-time models.
13. System performance indicators.
14. BIBO stability and simple tests.
15. Nyquist stability theory.
16. Lyapunov stability testing.
17. Steady state response: error constants and system type.
18. Root locus methods for analysis and design.
19. Desirable pole locations.
20. Bode diagrams for frequency domain analysis and design.
21. A special control law: deadbeat control.
22. Controllability.
23. Controller design by pole placement.
24. Observability.
25. State observers.
26. Optimal control by multiplier-type methods.
27. Other optimal control methods.
28. State estimation in noise.
29. State feedback using state estimates.
30. System identification.
31. Adaptive and self-tuning control.
32. Structures of multivariable controllers.
33. Linearization methods for nonlinear systems.
34. Variable structures and sliding mode control.
35. Intelligent control.
36. Robust control.
37. Discrete event control systems.
Appendix A. Appendix B. Appendix C. References. Index.