
Scaling of Symmetric Rank One Method in Solving Unconstrained Optimization Problem

Submitted By nemoz3122
CHAPTER 1
INTRODUCTION

Background Of the Study

In mathematics, an optimization problem consists of maximizing or minimizing a real function by systematically choosing input values from an allowed set and computing the value of the function. In other words, it means solving a problem so that we reach the goal as quickly as possible without wasting resources.

Optimization can also mean deviating from a target by the smallest possible margin. More generally, optimization theory and techniques comprise a large area of applied mathematics. In the simplest case, optimization means finding the best available value of some objective function over a defined domain, covering many different types of objective functions and domains.

In vector calculus, the gradient of a scalar field is a vector field that points in the direction of the greatest rate of increase of the scalar field, and whose magnitude is that rate of increase. In simple terms, the variation in space of any quantity can be represented by a slope; the gradient represents the steepness and the direction of that slope. The gradient of a scalar function f: R^n → R is denoted by ∇f, where ∇ denotes the vector differential operator.
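As an illustration (not part of the original text), the gradient can be approximated numerically by central finite differences; the helper `grad` and the sample function f(x) = x_1^2 + 3x_2^2 below are our own examples:

```python
import numpy as np

def grad(f, x, h=1e-6):
    """Central finite-difference approximation of the gradient of f at x."""
    x = np.asarray(x, dtype=float)
    g = np.zeros_like(x)
    for i in range(x.size):
        e = np.zeros_like(x)
        e[i] = h
        g[i] = (f(x + e) - f(x - e)) / (2 * h)  # (f(x+he_i) - f(x-he_i)) / 2h
    return g

f = lambda x: x[0]**2 + 3 * x[1]**2   # sample quadratic, gradient (2x_1, 6x_2)
print(grad(f, [1.0, 2.0]))            # close to [2, 12]
```

At the point (1, 2) the exact gradient is (2, 12), which the finite-difference sketch reproduces to several digits.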

The Hessian matrix was developed in the 19th century by the German mathematician Ludwig Otto Hesse and was later named after him. The Hessian matrix is the matrix of second derivatives of a multivariate function; in other words, it is the gradient of the gradient of a function. The Hessian matrix is relevant in many fields, including economics.

Let a real-valued function f: R^n → R be given. If all the second partial derivatives of f exist and are continuous over the domain of the function, then the Hessian matrix of f is

H(f)_ij (x) = D_i D_j f(x) = ∂^2 f(x)/(∂x_i ∂x_j),

where x = (x_1, x_2, …, x_n) and D_i is the differentiation operator with respect to the ith argument. Hence

H(f)(x) = ∇^2 f(x) =

[ ∂^2 f/∂x_1^2        ∂^2 f/(∂x_1 ∂x_2)   …   ∂^2 f/(∂x_1 ∂x_n)
  ∂^2 f/(∂x_2 ∂x_1)   ∂^2 f/∂x_2^2        …   ∂^2 f/(∂x_2 ∂x_n)
  ⋮                    ⋮                    ⋱   ⋮
  ∂^2 f/(∂x_n ∂x_1)   ∂^2 f/(∂x_n ∂x_2)   …   ∂^2 f/∂x_n^2 ]

The second derivative test determines whether a given critical point of a real function of one variable is a local maximum or a local minimum. Suppose the real function is twice differentiable at a critical point x:
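As a concrete illustration (not part of the original text), the Hessian can be approximated entry by entry with second-order finite differences; the helper `hessian`, the step size `h`, and the sample function below are our own assumptions:

```python
import numpy as np

def hessian(f, x, h=1e-5):
    """Finite-difference approximation of H_ij = d^2 f / (dx_i dx_j)."""
    x = np.asarray(x, dtype=float)
    n = x.size
    H = np.zeros((n, n))
    for i in range(n):
        for j in range(n):
            ei = np.zeros(n); ei[i] = h
            ej = np.zeros(n); ej[j] = h
            # central second difference in directions i and j
            H[i, j] = (f(x + ei + ej) - f(x + ei - ej)
                       - f(x - ei + ej) + f(x - ei - ej)) / (4 * h**2)
    return H

f = lambda x: x[0]**2 + x[0]*x[1] + 2*x[1]**2   # sample quadratic
print(hessian(f, np.array([0.3, -0.7])))        # close to [[2, 1], [1, 4]]
```

For this quadratic the exact Hessian is constant, [[2, 1], [1, 4]], which the approximation reproduces; note the matrix is symmetric, as the mixed partials agree.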

Theorem: Sufficient Conditions for an Extremum.

Let f be twice differentiable, then

If f''(x) < 0, then f has a local maximum at x.
If f''(x) > 0, then f has a local minimum at x.
If f''(x) = 0, the test is inconclusive.

For n-dimensional cases, we have the following criteria:

If ∇^2 f(x) is positive definite, then f has a local minimum at x.
If ∇^2 f(x) is negative definite, then f has a local maximum at x.
If ∇^2 f(x) is indefinite, then the test is inconclusive.
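The n-dimensional criteria above can be checked numerically through the eigenvalues of the Hessian; the sketch below (our own illustration, with an assumed tolerance `tol`) classifies a critical point:

```python
import numpy as np

def classify(H, tol=1e-8):
    """Second derivative test: classify a critical point from its Hessian H."""
    eig = np.linalg.eigvalsh(H)          # eigenvalues of the symmetric matrix H
    if np.all(eig > tol):
        return "local minimum"           # positive definite
    if np.all(eig < -tol):
        return "local maximum"           # negative definite
    return "inconclusive"                # indefinite or (near-)singular

print(classify(np.array([[2.0, 0.0], [0.0, 4.0]])))    # local minimum
print(classify(np.array([[-1.0, 0.0], [0.0, -3.0]])))  # local maximum
print(classify(np.array([[1.0, 0.0], [0.0, -1.0]])))   # inconclusive
```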

Objective Of the Study

In this study, we wish to investigate the effect of scaling within the symmetric rank one method for unconstrained optimization.
The specific objectives are as follows: to derive some scalings for the symmetric rank one update that preserve the positive definiteness of the updating matrix, and to compare the performance of the symmetric rank one method under various choices of scaling.

Outline Of the Report

This report is divided into 5 chapters. Chapter 1 consists of the background of the study, which includes an introduction to optimization, the gradient and the Hessian matrix. We also state the objectives of this study and give an overview of the report.

Chapter 2 gives a brief review of the literature on the Newton method, quasi-Newton methods and the symmetric rank one method.

Chapter 3 describes the methodology of this study and discusses the methods considered for solving the test problems.
In addition, the algorithms for the scaling of the symmetric rank one method are provided.

Next, in Chapter 4, we show the calculations and results for a few test problems. We compare the various scalings that we choose and, at the end, discuss which scaling is more effective for solving unconstrained optimization problems.

Finally, we conclude with the results and findings of this study in Chapter 5. In addition, we recommend some areas for further research that might play an important role in future studies.

CHAPTER 2
LITERATURE REVIEW

The Newton method, also called the Newton-Raphson method or Newton iteration, is a root-finding algorithm. However, the Newton method has its own drawbacks: it is an expensive method, meaning it can cost a great deal of computation to solve a problem.

The Newton method assumes that the function can be locally approximated as a quadratic in a region around the optimum, and it uses the first and second derivatives to find the stationary point. In higher dimensions, the Newton method uses the gradient and the Hessian matrix of second derivatives of the function to be minimized.
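The Newton iteration described above can be sketched as follows (our own illustration; the sample function f(x) = x_1^4 + x_2^2 and the stopping tolerance are assumptions, not from the original text):

```python
import numpy as np

def newton(grad, hess, x0, tol=1e-10, max_iter=50):
    """Newton iteration for minimization: x <- x - H(x)^{-1} g(x)."""
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        g = grad(x)
        if np.linalg.norm(g) < tol:
            break
        x = x - np.linalg.solve(hess(x), g)  # solve H p = g, then step x - p
    return x

# sample function f(x) = x_1^4 + x_2^2 with minimizer (0, 0)
grad = lambda x: np.array([4 * x[0]**3, 2 * x[1]])
hess = lambda x: np.array([[12 * x[0]**2, 0.0], [0.0, 2.0]])
print(newton(grad, hess, [1.0, 1.0]))  # close to [0, 0]
```

Note that each step requires forming and factorizing the Hessian, which is exactly the cost that quasi-Newton methods, discussed next, try to avoid.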

The main purpose of quasi-Newton methods is to find local maxima and minima of functions. Quasi-Newton methods are based on the Newton method for finding a stationary point of a function, where the gradient is zero. In mathematics, a stationary point is an input to a function at which the derivative is zero.

Today, quasi-Newton methods are recognized as among the most efficient ways to solve nonlinear unconstrained or bound-constrained optimization problems. Quasi-Newton methods are mostly used when the second derivative matrix of the objective function is either unavailable or too costly to compute. They are very similar to the Newton method, but they avoid the need to compute the Hessian matrix by recurring, from iteration to iteration, a symmetric matrix that can be considered an approximation of the Hessian.

The Hessian matrix does not need to be computed in a quasi-Newton method because the Hessian approximation is updated by analyzing successive gradient vectors instead. Furthermore, the quasi-Newton method is a generalization of the secant method, finding the root of the first derivative for multidimensional problems.

In multidimensional problems, the secant equation does not determine the update uniquely. Quasi-Newton methods therefore differ in how they constrain the solution, typically by adding a simple low-rank update to the current estimate of the Hessian.

W.C. Davidon, a physicist working at Argonne National Laboratory, developed the first quasi-Newton method in 1959, namely the Davidon-Fletcher-Powell updating formula (Davidon, 1970), but it is rarely used nowadays. A widely studied update now is the symmetric rank one method (SR1).

The SR1 method is a quasi-Newton method that updates the approximation of the Hessian matrix based on the derivatives calculated at two points. SR1 is a generalization of the secant method to multidimensional problems. The secant method is a root-finding algorithm that uses a succession of roots of secant lines to better approximate a root of a function. Compared to the secant method, the SR1 method is much simpler and may require less computation per iteration when unfactored forms of the methods are used.
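The secant method mentioned above can be sketched in a few lines (our own illustration; the sample equation x^2 - 2 = 0 and tolerances are assumptions):

```python
def secant(f, x0, x1, tol=1e-12, max_iter=100):
    """Secant method: approximate a root of f using the root of the secant
    line through the last two iterates (no derivatives required)."""
    for _ in range(max_iter):
        f0, f1 = f(x0), f(x1)
        if abs(f1) < tol:
            break
        # root of the secant line through (x0, f0) and (x1, f1)
        x0, x1 = x1, x1 - f1 * (x1 - x0) / (f1 - f0)
    return x1

print(secant(lambda x: x**2 - 2, 1.0, 2.0))  # close to sqrt(2) ≈ 1.41421
```

Quasi-Newton methods generalize this idea: the secant equation H_(k+1) y_k = s_k plays the role of the secant line in n dimensions.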

The symmetric rank one method updates the Hessian approximation, and its update rule is known to have good numerical performance. The symmetric rank one update maintains the symmetry of the matrix; however, it does not guarantee that the update will be positive definite. In general, an n × n complex matrix M is said to be positive definite if z*Mz is real and positive for all non-zero complex vectors z.

Under mild conditions, the sequence of Hessian approximations generated by the symmetric rank one method shows faster progress towards the true Hessian than popular alternatives such as the Davidon-Fletcher-Powell formula (DFP) (Davidon, 1970) or the Broyden-Fletcher-Goldfarb-Shanno formula (BFGS) (Shanno, 1970) in preliminary numerical experiments.

A quasi-Newton algorithm for unconstrained nonlinear minimization generates a sequence of matrices that can be considered approximations of the objective function's second derivatives. Conn et al. (1991) gave conditions under which these approximations can be proved to converge globally to the true Hessian matrix in the case where the symmetric rank one update formula is used. They also established the rate of convergence, which improves with the rate of convergence of the underlying iterates. Numerical experiments confirmed the theory and showed that convergence of the Hessian approximation is substantially slower for the other known quasi-Newton formulae.

Ford and Moghrabi (1997) used multi-step quasi-Newton methods for unconstrained optimization. They showed how an interpolating curve in the variable space could be used to derive an appropriate generalization of the secant equation normally employed in the construction of quasi-Newton methods. One of the most successful of these multi-step methods employs the current approximation to the Hessian to determine the parameterization of the interpolating curve and, hence, the derivatives required in the generalized updating formula.

Certain approximations were found to be necessary in this process in order to reduce the computation that must be repeated at each iteration to an acceptable level. However, they showed how a variant of the algorithm that avoids the need for such approximations may be obtained.

Hassan et al. (2002) proposed a scaled symmetric rank one update method for unconstrained optimization, because a disadvantage of the method is that the symmetric rank one update may not preserve positive definiteness even when started from a positive definite approximation. A simple remedy to this problem is to restart the update with the initial approximation, usually the identity matrix, whenever this difficulty arises. However, numerical experience shows that restarting with the identity matrix is not a good choice. Instead, they used a positive multiple of the identity matrix, where the positive scaling factor is the optimal solution of a measure defined by the problem.

Ting and Sun (2008) proposed a new quasi-Newton pattern search method based on the symmetric rank one update for unconstrained optimization. They developed a robust and quickly convergent pattern search method based on an implementation of the optimal-conditioning-based self-scaling symmetric rank one algorithm. The method utilizes the factorization of the approximating Hessian matrices to provide a series of convergent positive bases needed in the pattern search process. Numerical experiments on some well-known optimization test problems showed that the new method performs well and is competitive with other derivative-free methods.

A generalized symmetric rank one method was developed by Modarres et al. (2009) employing interpolatory polynomials. In numerical analysis, polynomial interpolation is the interpolation of a given data set by a polynomial. They employed it in order to extract more accurate information from more than one previous step.

The basic idea of their study is to incorporate the symmetric rank one update within the framework of multi-step methods. The iterates are interpolated by a curve in such a way that consecutive points define the curve. To preserve the positive definiteness of the symmetric rank one updates, a restart procedure is applied, in which the symmetric rank one update is restarted with a scaled identity matrix. They also compared their results with the Broyden-Fletcher-Goldfarb-Shanno (BFGS) formula.

Symmetric rank one is one of the competitive formulae among the quasi-Newton methods. Modarres et al. (2009) proposed some modified symmetric rank one updates based on modified secant equations, which use both gradient and function information. They applied a restart procedure to these updates in order to prevent the loss of positive definiteness and zero denominators in the new symmetric rank one updates.

Three new algorithms with modified secant equations were given to improve the Hessian approximation of the symmetric rank one method. Their results clearly showed that the proposed algorithms are very encouraging and have advantages over the standard symmetric rank one method.

For large-scale optimization, Leong and Hassan (2011) developed a scaled memoryless symmetric rank one method. They considered the memoryless quasi-Newton method, that is, the quasi-Newton method in which the approximation to the inverse of the Hessian is, at each step, updated from the identity matrix. As a result, the search direction can be computed without storing any matrices.

The basic idea is to incorporate the symmetric rank one update within the framework of the memoryless quasi-Newton method. However, the symmetric rank one update may not preserve positive definiteness even when updated from a positive definite matrix. They therefore proposed a memoryless symmetric rank one method that is updated from a positive scaling of the identity, where the scaling factor is derived in such a way that the positive definiteness of the updating matrices is preserved while at the same time improving the conditioning of the scaled memoryless symmetric rank one update. Their study showed that the optimally scaled memoryless symmetric rank one method is very encouraging.

Khiyabani and Leong (2012) presented a new symmetric rank one method with restarts for solving unconstrained optimization problems. The new method attempts to improve the quality of the symmetric rank one Hessian by employing a scaling of the identity in a certain sense.

They proposed an update criterion based on the eigenvalues of the symmetric rank one update to measure its quality, since at some iterations these updates might be singular, indefinite or undefined. The criterion is employed only to improve the approximation of the symmetric rank one Hessian. The numerical results support the theoretical considerations on the usefulness of the criterion and show that the proposed method improves the performance of the symmetric rank one update substantially.

CHAPTER 3
METHODOLOGY

3.1 Introduction

This chapter describes the detailed procedures of the scaling of the symmetric rank one method for solving unconstrained optimization problems.

In this study, we are concerned with numerical methods for solving the unconstrained nonlinear optimization problem

min f(x), (3.1.1)

where the objective function f: R^n → R is twice continuously differentiable and defined on n-dimensional space.

Among the numerous iterative methods for solving problem (3.1.1), the quasi-Newton method can be used. The quasi-Newton method requires only the gradient of the objective function to be supplied at each iterate.

The simplest possibility is to have

H_(k+1) = H_k + E_k = H_k + a u u^T, (3.1.2)

in which a is a scalar and a symmetric rank one matrix E_k = a u u^T is added to H_k, where s_k = x_(k+1) - x_k, y_k = g_(k+1) - g_k and g_k = ∇f(x_k).

If the quasi-Newton equation

H_(k+1) y_k=s_k, (3.1.3)

is to be satisfied, it follows from (3.1.2) and (3.1.3) that

H_(k+1) y_k=H_k y_k+auu^T y_k=s_k, (3.1.4)

or

auu^T y_k=s_k-H_k y_k . (3.1.5)

Hence, u is proportional to s_k - H_k y_k. We may therefore take

u = s_k - H_k y_k. (3.1.6)

Then, a u^T y_k = 1, (3.1.7)

or

a = 1/(u^T y_k) = 1/((s_k - H_k y_k)^T y_k). (3.1.8)

Thus, the symmetric rank one formula is given by

H_(k+1) = H_k + ((s_k - H_k y_k)(s_k - H_k y_k)^T)/((s_k - H_k y_k)^T y_k), (3.1.9)
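The update (3.1.9) can be sketched in a few lines of code (our own illustration; the skipping safeguard with threshold `eps` is a standard practical device, not part of the derivation above):

```python
import numpy as np

def sr1_update(H, s, y, eps=1e-8):
    """Symmetric rank one update (3.1.9) of the inverse-Hessian approximation H.
    Skips the update when the denominator is too small (a standard safeguard)."""
    r = s - H @ y                      # s_k - H_k y_k
    denom = r @ y                      # (s_k - H_k y_k)^T y_k
    if abs(denom) < eps * np.linalg.norm(r) * np.linalg.norm(y):
        return H                       # skip: denominator nearly zero
    return H + np.outer(r, r) / denom

# the updated matrix satisfies the quasi-Newton equation H_{k+1} y_k = s_k
H = np.eye(2)
s = np.array([0.5, -0.2])
y = np.array([1.0, 0.3])
H1 = sr1_update(H, s, y)
print(np.allclose(H1 @ y, s))  # True
```

The final check confirms that the quasi-Newton equation (3.1.3) holds exactly after one update.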

where s_k = x_(k+1) - x_k, y_k = g_(k+1) - g_k and g_k = ∇f(x_k) denotes the gradient vector of f at the current iteration point x_k.

In order to ensure the positive definiteness of the symmetric rank one method, various choices of scaling are used, where H_k is updated using the following formula

H_k = θH_(k-1) + ((s_(k-1) - θH_(k-1) y_(k-1))(s_(k-1) - θH_(k-1) y_(k-1))^T)/((s_(k-1) - θH_(k-1) y_(k-1))^T y_(k-1)), (3.1.10)

where θ is chosen so as to keep the positive definiteness. Note that if (s_(k-1) - θH_(k-1) y_(k-1))^T y_(k-1) > 0, then H_k is positive definite whenever θ > 0 and H_(k-1) is positive definite. So an appropriate choice of θ should be such that

(s_(k-1) - θH_(k-1) y_(k-1))^T y_(k-1) > 0, (3.1.11)

that is,

s_(k-1)^T y_(k-1) - θ y_(k-1)^T H_(k-1) y_(k-1) > 0, (3.1.12)

which, provided y_(k-1)^T H_(k-1) y_(k-1) > 0, is equivalent to

θ < (s_(k-1)^T y_(k-1))/(y_(k-1)^T H_(k-1) y_(k-1)). (3.2.3)

Then we have

s_(k-1)^T y_(k-1) - θ y_(k-1)^T H_(k-1) y_(k-1) > 0, (3.2.4)

so any positive θ below the bound in (3.2.3) preserves the positive definiteness of H_k.
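A minimal sketch of the scaled update (3.1.10) follows (our own illustration; the particular choice θ equal to a fraction `c` of the bound, and the identity-restart fallback, are assumptions for demonstration, not prescribed by the derivation above):

```python
import numpy as np

def scaled_sr1_update(H, s, y, c=0.5):
    """Scaled SR1 update (3.1.10): H_k = theta*H + r r^T / (r^T y),
    with r = s - theta*H y and theta chosen strictly below the bound
    s^T y / (y^T H y) so that positive definiteness is preserved.
    The fraction c in (0, 1) of the bound is an illustrative choice."""
    yHy = y @ H @ y
    sy = s @ y
    if yHy <= 0 or sy <= 0:
        return np.eye(len(s))          # fall back: restart with the identity
    theta = c * sy / yHy               # 0 < theta < s^T y / (y^T H y)
    r = s - theta * (H @ y)
    return theta * H + np.outer(r, r) / (r @ y)

H = np.eye(2)
s = np.array([1.0, 0.5])
y = np.array([0.8, 0.4])
H1 = scaled_sr1_update(H, s, y)
print(np.all(np.linalg.eigvalsh(H1) > 0))  # positive definite
```

With this choice the denominator equals (1 - c) s^T y > 0, so condition (3.1.11) holds automatically and the updated matrix stays positive definite.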
