hn.academy

Online courses recommended by Hacker News users. [about]

Discrete Optimization

Coursera · The University of Melbourne · 51 HN points · 16 HN citations

Tired of solving Sudokus by hand? This class teaches you how to solve complex search problems with discrete optimization concepts and algorithms, including constraint programming, local ...

View on Coursera
The vast majority of the courses listed here on HN.Academy are available from their providers for free. Many courses offer a completion certificate for a fee. A few courses and specializations require an enrollment fee. HN.Academy receives a referral commission when you visit course pages through links on this site and then purchase courses and completion certificates. If you decide to purchase a certificate or course, the commission does not increase the cost to you, and it helps support the continued existence of HN.Academy, which is much appreciated.

Hacker News Comments about Discrete Optimization

All the comments and stories posted to Hacker News that reference this course.
Nov 21, 2018 davidgl on Essentials of Metaheuristics (2015) [pdf]
If you find this area fascinating, and I do, I can't recommend this Coursera course enough: https://www.coursera.org/learn/discrete-optimization
Nov 08, 2018 shoo on How We Built a Cost-Based SQL Optimizer
Maybe of interest - postgres planner documentation

https://www.postgresql.org/docs/current/planner-optimizer.ht...

Genetic query optimiser: https://www.postgresql.org/docs/current/geqo.html

postgres has a number of hooks, which can be used to override the default behaviour. In particular, there are a bunch of hooks that can be used to install a custom query planner. https://github.com/AmatanHead/psql-hooks/blob/master/Readme....

more generally, ignoring database query optimisation specifically, if you are interested in learning about discrete optimisation techniques, I recommend this course: https://www.coursera.org/learn/discrete-optimization

Apr 22, 2018 tasseff on Ask HN: How can learn about Linear Programming/Optimization?
I'm a Ph.D. student in operations research (OR). My suggestion would be to first build a strong foundation in linear programming. This will introduce you to the notion of duality, which is heavily emphasized in many mathematical programming courses. Here's a good open source book on linear programming written by Jon Lee, the current editor of Mathematical Programming A: https://github.com/jon77lee/JLee_LinearOptimizationBook
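
To make the notion of duality concrete, here is the standard primal-dual pair for a linear program (a textbook statement in LaTeX notation, not tied to any particular book above):

    \begin{aligned}
    \text{(primal)}\quad & \min_x\ c^\top x && \text{s.t. } Ax \ge b,\ x \ge 0\\
    \text{(dual)}\quad   & \max_y\ b^\top y && \text{s.t. } A^\top y \le c,\ y \ge 0
    \end{aligned}

Weak duality says every feasible dual objective value is a lower bound on the primal optimum; strong duality says the two optima coincide when both problems are feasible.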

Then I'd suggest studying more general methods for continuous and convex optimization. The book I see mentioned a lot is Convex Optimization by Boyd and Vandenberghe, although we didn't use this in our coursework. Instead, we used a lot of the material presented here: http://mitmgmtfaculty.mit.edu/rfreund/educationalactivities/

If you read the above (or any other two books on linear programming and convex optimization), you'll probably have a better idea of what you want to study next and how you want to go about it. The next natural step would be to study combinatorial (i.e., integer or mixed-integer) optimization. (Jon Lee has another book on this subject; I've also heard good things about the Schrijver book.)

It's also not a bad idea to start getting familiar with the tools people use to solve mathematical programs in practice. The theory you'll learn to solve linear programs is different from the theory for general convex problems, and similarly for mixed-integer programs. As such, there are various "solvers" that are used to solve various problem types. Because of this, there are a lot of frameworks that are used to develop general mathematical programs, which can then interface with these solvers in a generic way. For example, Pyomo is a modeling language based in Python, and JuMP is a modeling language based in Julia.
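
To give a feel for what these modeling languages look like, here is a minimal Pyomo sketch (assuming Pyomo and the GLPK solver are installed; the tiny LP itself is made up purely for illustration):

    from pyomo.environ import (ConcreteModel, Var, Objective, Constraint,
                               NonNegativeReals, maximize, SolverFactory)

    # A toy LP: maximize 3x + 2y subject to two resource constraints.
    model = ConcreteModel()
    model.x = Var(within=NonNegativeReals)
    model.y = Var(within=NonNegativeReals)
    model.profit = Objective(expr=3 * model.x + 2 * model.y, sense=maximize)
    model.labor = Constraint(expr=model.x + model.y <= 4)
    model.material = Constraint(expr=model.x + 3 * model.y <= 6)

    # The model is written independently of the solver; here we hand it to GLPK.
    SolverFactory("glpk").solve(model)
    print(model.x(), model.y())

JuMP in Julia reads much the same, just with Julia syntax.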

If you want a general introduction to discrete optimization in a course format, you should take this class on Coursera ( https://www.coursera.org/learn/discrete-optimization ). It's a fantastic introduction to operations research in general, and it will give you a good intuition of the types of problems you encounter in the field as well as the various techniques that are used to solve them. (Full disclosure: Pascal is my adviser.)

Let me know if you have any more questions. I recently became a moderator of /r/OperationsResearch on Reddit and love the idea of having a larger community that can answer questions like yours. Operations research seems somewhat overshadowed by machine learning at the moment (at least in popular culture), but they truly go hand-in-hand. I'm somewhat surprised that a lot of useful techniques from OR are not heavily discussed by machine learning practitioners. Both fields are, fundamentally, trying to solve difficult optimization problems.

Apr 03, 2018 lowbloodsugar on Ask HN: What are the best MOOCs you've taken?
I really enjoyed Discrete Optimization, Pascal Van Hentenryck, Coursera. [1] I did it in 2013 and it looks like it's changed a little: vehicle routing seems a good, practical topic right now. Optimizing systems is one of my favorite pleasures, so this course was great for me.

[1] https://www.coursera.org/learn/discrete-optimization

Apr 03, 2018 cjauvin on Ask HN: What are the best MOOCs you've taken?
Three Coursera MOOCs I particularly enjoyed:

* Discrete Optimization: almost entirely problem-driven, very challenging and entertaining prof; https://www.coursera.org/learn/discrete-optimization

* Crypto I: very deep, thorough and crystal clear explanations; https://www.coursera.org/learn/crypto

* Computer Networks: excellent overall course covering a wide variety of topics; https://www.coursera.org/instructor/~517478 , https://www.youtube.com/playlist?list=PLfgkuLYEOvGMWvHRgFAcj...

Jan 29, 2018 brudgers on Ask HN: How do I study optimization?
Though I did not successfully complete it, this course was very interesting and informative: https://www.coursera.org/learn/discrete-optimization
Jun 07, 2016 codepie on Applied Mathematical Programming (1977)
You can add the optimization course on coursera https://www.coursera.org/course/optimization to your list. I have been waiting for it to start and it seems that it is finally going to start this month ( https://class.coursera.org/optimization-003/forum/thread?thr... ).
Nov 30, 2015 cjauvin on Ask HN: What's your favorite online course?
Discrete Optimization https://www.coursera.org/course/optimization -- I really enjoyed this challenging class, taught by a very dynamic teacher and organized around a set of tough problems that you can tackle using a choice of optimization paradigms (e.g. you can decide to "specialize" in "local search" if you want, and try to solve everything with it).
Jul 07, 2015 fridek on Google Optimization Tools
Coursera has a great course on discrete optimization [1] where I learned and used or-tools. They are rather nice, but the documentation is only half done (basically, the code is the best documentation) and some interfaces are not compatible with others. I ended up forking or-tools for my own use and tweaking many unexposed internals. I guess it's extremely difficult to implement a generic optimization solver, so I won't complain, but be prepared that it's not an out-of-the-box thing (I doubt any solver is).

[1] https://www.coursera.org/course/optimization

May 08, 2015 steve_g on Ask HN: Best courses you took?
Discrete Optimization with Pascal Van Hentenryck on Coursera - https://www.coursera.org/course/optimization

It doesn't look like there are any open sessions now.

Dec 09, 2014 osoba on Ask HN: How or where to begin learning mathematics from first principles?
I will try to list resources in a linear fashion, so that each one naturally builds on the previous (in terms of knowledge).

[PREREQUISITES]

First things first, I assume you went to high school, so you don't need a full pre-calculus course. This assumes you, at least intuitively, understand what a function is; you know what a polynomial is; what rational, imaginary, real and complex numbers are; you can solve any quadratic equation; you know the equation of a line (and of a circle) and you can find the point where two lines intersect; you know the perimeter, area and volume formulas for common geometrical shapes/bodies; and you know trigonometry in the context of a triangle. The Khan Academy website (or simple googling) is good for filling any gaps in this.

[BASICS]

You would obviously start with calculus. Jim Fowler's Calculus 1 is an excellent first start if you don't know anything about the topic. Calculus: Single Variable https://www.coursera.org/course/calcsing is the more advanced version, which I would strongly suggest, as it requires very few prerequisites and goes into some deeper practical issues.

By far the best resource for Linear Algebra is the MIT course taught by Gilbert Strang http://ocw.mit.edu/courses/mathematics/18-06sc-linear-algebr... If you prefer to learn through programming, https://www.coursera.org/course/matrix might be better for you, though this is a somewhat lightweight course.

[SECOND STEP]

After this point you might want to review single variable calculus through a more analytical approach on MIT OCW http://ocw.mit.edu/courses/mathematics/18-01sc-single-variab... as well as venture into multivariable calculus http://ocw.mit.edu/courses/mathematics/18-02sc-multivariable...

An excellent book for single variable calculus (though in reality it's a book on mathematical analysis) is Spivak's "Calculus" (depending on where you are, legally or illegally obtainable here http://libgen.org/ (as are the other books mentioned in this post)). A quick and dirty run through multivariable analysis is Spivak's "Calculus on Manifolds".

Another excellent book (that covers both single and multivariable analysis) is Walter Rudin's "Principles of Mathematical Analysis" (commonly referred to as "baby Rudin" by mathematicians), though be warned, this is an advanced book. The author won't coddle you with superfluous explanations, and you may encounter many examples of "magical math" (you are presented with a difficult problem and the solution is a clever idea that somebody magically pulled out of their ass in a stroke of pure genius, making you feel like you would never have thought of it yourself and should probably give up math forever. Obviously don't; this is common in mathematics. Over time proofs get refined until they reach a very elegant form, and are only presented that way, obscuring the decades/centuries of work that went into that solution.)

At this point you have all the necessary knowledge to start studying Differential Equations http://ocw.mit.edu/courses/mathematics/18-03sc-differential-...

Alternatively, you can go into Probability and Statistics https://www.coursera.org/course/biostats https://www.coursera.org/course/biostats2

[FURTHER MATH]

If you have gone through the above, you already have all the knowledge you need to study the areas you mentioned in your post. However, if you are interested in further mathematics you can go through the following:

The actual first principles of mathematics are propositional and first-order logic. It would, however, (imo) not be natural to start your study of maths with it. A good resource is https://www.coursera.org/course/intrologic and possibly https://class.stanford.edu/courses/Philosophy/LPL/2014/about

For Abstract algebra and Complex analysis (two separate subjects) you could go through Saylor's courses http://www.saylor.org/majors/mathematics/ (sorry, I didn't study these in English).

You would also want to find some resource to study Galois theory, which would be a nice bridge between algebra and number theory. For number theory I recommend the book by G. H. Hardy.

At some point in life you'd also want to go through Partial Differential Equations, and perhaps Numerical Analysis. I guess check them out on Saylor http://www.saylor.org/majors/mathematics/

Topology by Munkres (it's a book)

Rudin's Functional Analysis (this is the "big/adult rudin")

Hatcher's Algebraic Topology

[LIFE AFTER MATH]

It is, I guess, natural for mathematicians to branch out into:

[Computer/Data Science]

There are, literally, hundreds of courses on edX, Coursera and Udacity so take your pick. These are some of my favorites:

Artificial Intelligence https://www.edx.org/course/artificial-intelligence-uc-berkel...

Machine Learning https://www.coursera.org/course/ml

The 2+2 Princeton and Stanford Algorithms classes on Coursera

Discrete Optimization https://www.coursera.org/course/optimization

Convex Optimization https://itunes.apple.com/itunes-u/convex-optimization-ee364a... https://itunes.apple.com/us/course/convex-optimization-ii/id...

[Physics]

http://theoreticalminimum.com/courses

http://ocw.mit.edu/courses/physics/

Aug 15, 2014 mgraczyk on The Knapsack Problem
I wrote a solution using this table building DP algorithm a few years ago.

    https://github.com/mgraczyk/DiscreteOptimization/blob/master/knapsack/speed_up/solve_it.cpp
This was part of an assignment for a Coursera Discrete Optimization course created by The University of Melbourne. It's a great course and I recommend it to anybody who wants to hone their understanding of dynamic programming and solving computationally expensive problems.

    https://www.coursera.org/course/optimization
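
For reference, the table-building dynamic program mentioned above looks roughly like this in Python (a sketch of the standard 0/1 knapsack recurrence, not the linked C++ solution):

    def knapsack(values, weights, capacity):
        """0/1 knapsack DP: best[c] is the best value achievable with
        capacity c using only the items processed so far."""
        best = [0] * (capacity + 1)
        for v, w in zip(values, weights):
            # Go through capacities downward so each item is used at most once.
            for c in range(capacity, w - 1, -1):
                best[c] = max(best[c], best[c - w] + v)
        return best[capacity]

    print(knapsack([8, 10, 15, 4], [4, 5, 8, 3], 11))  # -> 19 (the items worth 15 and 4)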

Mar 31, 2014 pedrosorio on Big data: are we making a big mistake?
I think defining "big data" as something that depends on the algorithm you are applying is not very useful. In that sense, almost anything is big data when you are trying to solve an NP-complete problem (nice course about algorithmic approaches to these problems: https://www.coursera.org/course/optimization ).

The problem is, if you define "big data" as something that depends on the algorithm, then it makes no sense to include the word "data" in it. The expression "big data" as it is commonly used refers to flows of data so big that you need specialized approaches even when applying simple transformations to the data.

Since the dawn of computing we've always wanted to solve problems, with small or large amounts of data, that required complex algorithms. The usage of a new expression is justified by the fact that huge flows of data are now available to many companies (mainly because of the web), not because these companies are attempting to perform extremely complicated transformations to the data.

TL;DR: Big data means big volume/flow of data, and not (as you are defining it) using large Big-O complexity algorithms on some set of data. In fact, the size of big data precludes applying large Big-O complexity algorithms to said data.

Feb 26, 2014 shoo on Overkilling the 8-queens Problem
N = 2 ^ 18 = 262144 queens in less than 6 seconds in a VM on a laptop

it's in ~210 lines of C++:

https://bitbucket.org/rfc/dopt_nqueen/src/3e812164d97a8e91d4...

uses a random local search approach, as described in Russell and Norvig ( http://aima.cs.berkeley.edu/ ).

wrote this code during https://www.coursera.org/course/optimization
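
A minimal sketch of that style of random local search (min-conflicts, as in Russell and Norvig) for N-queens in Python; not the linked C++ code, and without the incremental bookkeeping that makes the C++ version fast, but the same idea:

    import random

    def queens_local_search(n, max_steps=200_000):
        """One queen per column; repeatedly move a conflicting queen to the
        row in its column with the fewest conflicts (min-conflicts heuristic)."""
        rows = [random.randrange(n) for _ in range(n)]

        def conflicts(col, row):
            return sum(1 for c in range(n) if c != col and
                       (rows[c] == row or abs(rows[c] - row) == abs(c - col)))

        for _ in range(max_steps):
            bad = [c for c in range(n) if conflicts(c, rows[c]) > 0]
            if not bad:
                return rows                      # a valid placement
            col = random.choice(bad)
            rows[col] = min(range(n), key=lambda r: conflicts(col, r))
        return None                              # give up after max_steps

    print(queens_local_search(64))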

Jan 17, 2014 PierreMage on Yale students made a better version of its course catalog. Yale shut it down
You can take a look at Discrete Optimization course on Coursera: https://www.coursera.org/course/optimization

The next session will start March 4th 2014.

Aug 06, 2013 wcbeard10 on Introduction to Recommender Systems
I believe their Discrete optimization course is at least somewhat self-paced. [1] Also, not sure if all of the Stanford/edX courses are, but Medical Stats is, and it's been really helpful for keeping my attention.

[1] https://www.coursera.org/course/optimization [2] medstats.class.stanford.edu

Jun 18, 2013 corey submitted Discrete Optimization course begins today (51 points, 25 comments)
xtrumanx
According to a staff reply[0] on the forums, assignment submissions only include the output of your programs, so you can use any language you'd like in the course. Did anyone do the Scala course that ended last month? This course seems like the perfect place to try out your newly developed functional programming skills.

I've watched the preliminary videos and I've got to say this seems like some really good stuff. If you like programming challenges, you should definitely sign up for this course. All the assignments and video lectures are available from the start, so you can study at your own pace, but you need to submit your assignments on time to receive credit.

Also look out for:

- Algorithms: Design and Analysis, Part 1[1]

- Coding the Matrix: Linear Algebra through Computer Science Applications[2]

They're both starting in 12 days on July 1st.

[0, You may need to be enrolled and logged in to view this thread] https://class.coursera.org/optimization-001/forum/thread?thr...

[1]https://www.coursera.org/course/algo

[2]https://www.coursera.org/course/matrix

petercooper
Be sure to watch the promo video at the top. It does nothing to dispel the stereotype that the best academics are borderline crazy.. ;-)
astrec
Pascal is one of the most enthralling lecturers I've ever been... well, lectured by. He's incredibly enthusiastic. I think it comes across in his videos, no? ;)
WildUtah
If you like this kind of thing and have a slightly competitive mindset, you might also like:

http://www.codeforces.com http://www.topcoder.com/tc

snake_plissken
I enrolled and I am very excited. The material looks very interesting, especially since I don't have any formal background in this field and I work in operations research.
corey
How do you like working in OR? Are there a lot of jobs in the field? I'm a math major and like programming, so it seems like it'd be an interesting career for me.
astrec
Not lots of jobs, no, but there are jobs. It's an interdisciplinary field, so your math and programming background will hold you in good stead. Some basic stats are a good idea too.

Of course some people are doing OR but don't know they're doing it. Some just use another label. Other useful search terms include "management science", "operational research" (UK), "advanced analytics", and of course "optimi(s||z)ation".

If you're after an introductory text I can recommend Wayne Winston's Operations Research: Applications and Algorithms. 4th edition. ISBN-13: 978-0534380588

antman
From the lecture: "These are the problems we are going to deal with. If you find any efficient solutions for all of them call me. And don't tell ANYONE"

FYI, so far it seems that this course requires a lot of hours...

xtrumanx
He mentioned that you may need to invest about 12 hours a week on the assignments, but that figure is more accurate for the later assignments, as they get harder as you go along.

The first week's assignment does not seem that hard after having watched the lecture but I've yet to submit my assignment.

It's a free course and there's no harm in enrolling and giving it a shot. At the very least, you'll be exposed to some new and interesting stuff. At best, you'll master some new and interesting stuff.

To me, this seems like the kind of stuff that'll make me "level-up" as a programmer so I'm pretty excited and enthusiastic right now.

pyoung
Ugghhh, I signed up for this; my only complaint with Coursera is that there are way too many classes I want to take. I'm already taking two classes that are about to wrap up and am probably spending 15-20 hrs/wk on them (lectures included). Hopefully I can manage the load for the next two weeks.
azth
Hehe, same here! So many fantastic courses to choose from, not enough time :)
kenster07
I, for one, am very appreciative of this course, and the professor for taking the time to do it. Doing the first week's assignment, I've already learned a lot about programming for NP-hard problems.
graycat
Not too impressed with the course.

My qualifications: I got started in optimization scheduling the fleet at FedEx. In the words of FedEx founder, COB, CEO F. Smith, my work "Solved the most important problem facing the start of Federal Express" and the output from my software was "An amazing document".

Later my Ph.D. research was in optimization and, in part, what I'd wished I'd known while at FedEx.

Since my Ph.D., there was once a problem in 'resource allocation' that became a 0-1 integer linear programming problem with 40,000 constraints and 600,000 variables. I derived some math, wrote some software, and in 905 seconds on a 90 MHz PC got a feasible solution within 0.025% of optimality.

And there have been other problems.

For the video: a good first suggestion for attacking a knapsack problem is dynamic programming.

By a wide margin, the most important topic in optimization is the simplex algorithm for linear programming. While the algorithm directly solves many problems in practice, it is the most important tool in solving optimization problems that are not linear programming. Usually there the role of linear programming is as a local linear approximation in an iterative scheme of better approximations.

Much of the reason for the power of linear programming as a tool in larger iterative schemes is the several ways one can take a linear programming problem and an optimal solution for it, make some changes in the problem, start with the optimal solution for the old problem, and get an optimal solution for the new problem very quickly.

Good starts in linear programming include:

Mokhtar S. Bazaraa and John J. Jarvis, 'Linear Programming and Network Flows', ISBN 0-471-06015-1, John Wiley and Sons.

Vasek Chvatal, 'Linear Programming', ISBN 0-7167-1587-2, W. H. Freeman, New York.

Yes, the min-cost capacitated (each arc has a maximum flow it can carry) network flow problem, a special case of which is the transshipment problem (find the least cost way to ship some one good from factories to warehouses; also the source of the Kantorovich Nobel prize in economics) is a linear programming problem, and there the simplex algorithm simplifies and becomes the network simplex algorithm. Curiously, and of importance for fast computation, in the network simplex algorithm a basic feasible solution corresponds to a spanning tree in the network. For that problem, don't miss the W. Cunningham idea of a 'strongly feasible basis' or the more recent polynomial algorithm of D. Bertsekas.

A significantly large fraction of real problems in combinatorial and discrete optimization can be solved as min-cost capacitated network flow problems via the network simplex algorithm. A big reason is, if the flow capacities are all integers and you have an integer basic feasible solution, then the simplex algorithm maintains integer basic feasible solutions for no extra work and, with the usual assumptions, produces an optimal integer basic feasible solution. So this is a case of linear programming where you get integer linear programming for no extra effort; really, since nearly all the arithmetic is just integer, you don't have to be concerned about roundoff error and you get the solution for less effort.

Yes, in combinatorial and discrete optimization, along with the rest of optimization, linear algebra helps.

Curiously, one of the more powerful approaches to combinatorial optimization is a technique from nonlinear programming called Lagrangian relaxation, and there commonly the main computational step is linear programming. With that technique, some nonlinear duality theory can yield bounds on how far we are from an optimal solution; such duality theory is what yielded the bound of 0.025% above.
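
For readers who have not seen it, the bound works like this (a standard statement, not specific to the problem above): relax the "hard" constraints Ax >= b into the objective with multipliers lambda >= 0, keep the "easy" constraints x in X, and every choice of lambda gives a lower bound on the true minimum:

    L(\lambda) \;=\; \min_{x \in X}\ c^\top x + \lambda^\top (b - Ax)
    \;\le\; \min\{\, c^\top x : Ax \ge b,\ x \in X \,\}
    \qquad \text{for every } \lambda \ge 0.

Comparing the best feasible solution found so far against the best such bound, the maximum of L(lambda) over lambda, is what produces gap statements like "within 0.025% of optimality".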

For combinatorial optimization there is, say,

George L. Nemhauser and Laurence A. Wolsey, 'Integer and Combinatorial Optimization', ISBN 0-471-35943-2, John Wiley & Sons, Inc., New York.

William J. Cook, William H. Cunningham, William R. Pulleyblank, and Alexander Schrijver, 'Combinatorial Optimization', ISBN 0-471-55894-X, John Wiley & Sons, Inc., New York.

In the OP, the emphasis on NP-completeness and algorithms that are worst case exponential is a bit exaggerated. E.g., as in the work of Klee and Minty, the simplex algorithm is worst case exponential, but in expectation in practice it is low order polynomial -- for why, there was some good work by K. Borgwardt.

For integer, combinatorial, and discrete optimization, I have yet to see a claim that getting approximately optimal solutions is NP-complete. In particular, if in a real problem you have saved $10 million and have a bound saying that you can't save more than another $2000, then why spend millions trying to save the last tiny fraction of the last penny and guarantee to have done so? If you are willing to sacrifice the last penny, then it is not so clear (so far at least to me) that you really are being forced into exponential algorithms, either average or worst case.

The claim in the OP that combinatorial and discrete optimization are part of computer science looks like a case of 'academic theft'! Such optimization has long been regarded as part of applied math and there part of operations research. The material has been of interest in, say, departments of mathematical sciences. Some parts of optimization, e.g., control theory, have been regarded as parts of electronic engineering. At times it does appear that academic computer science wants to regard every topic that makes heavy use of computing as part of computer science! It looks like computer science has gotten tired of just quicksort, AVL trees, BNF, YACC, and RDBMS! :-)!

The claims in the OP about the importance of optimization in the economy are a bit over enthusiastic. Maybe the importance would, could, and should be there, and maybe in Australia they are there, but my experience in the US is that there are essentially no significant non-academic career opportunities in optimization. E.g., at one time near year 2000, I had a nice, long talk with one of the leading US headhunters in technical fields, and he finally confessed that in the previous year in all of the US there had been exactly one recruiting request in operations research. He "packed it in" (from the movie 'The Sting', i.e., gave up) and went to law school!

Here is a good future -- although a wildly long long shot -- for integer, combinatorial, and discrete optimization: At present, all but quite simple problems in such optimization are solved by some experts attacking each problem one at a time, exploiting special structure of each problem, trying various techniques, and, hopefully, finally getting some good results. It's custom, one-off, bench scale, expert, creative, hand work and often, really, some applied research.

In addition, there have long been programs that can be used to pose or enter problems in integer, combinatorial, and discrete optimization suitable for solution by a 'solver'. The problem is, since the programs are general purpose, the 'solvers' to be used are also general purpose and, without the hand work, too often in practice are not powerful enough.

Presumably a polynomial algorithm of low degree that showed that P = NP could be the core of a general purpose solver powerful enough to make solving such optimization problems easy and routine.

Lacking such a solver, in practice patience with such optimization has long been a bit thin.

So, what the heck happened? In much of physics, say, planetary motion, we can write down some equations the real system must solve. If the equations have a unique solution, then that solution must be the right one for the real system. At times since early in WWII, such physics was of enormous value for at least the US DoD.

So, given such an equation, solve it. This technique often worked for, say, initial value problems for relatively simple ordinary differential equations. Alas, for the partial differential equations of fluid flow, the Navier-Stokes equations, the situation was usually a bit challenging.

Also in problems for the US DoD, e.g., 'logistics' or how to move a large army across a large ocean in least time within capabilities of existing resources, could write down some equations and other mathematical specifications that a 'best' solution had to satisfy. So, then, to say how to move the army, just 'solve' the equations, etc. This work was called 'operations research', and at times the US DoD was very enthusiastic about it. With enough US Federal Government research grants to academics, parts of academics got enthusiastic about it. And so did Bell Labs.

But there was no law of the universe that said that finding solutions to such mathematical specifications would be easy. Yes, Navier-Stokes is another example of difficulty. Now, in cryptography, so is the fundamental theorem of arithmetic, that is, factoring a positive integer into a product of prime numbers.

Clay Mathematics Institute in Boston has more than one prize of $1 million available for progress on such things.

Net, it does very much appear that for now making money by building a popular Web site and selling ads on it is a much easier way to make money! And, at times, in such work some applied math, possibly new, can be an advantage. But, clearly for now, the grand goals of integer, combinatorial, and discrete optimization are too difficult for 'prime time' in practice in mainstream business.

Or, problem A can make a lot of money if we have a lot of applied math and computing. So, let's attack problem A. Alas, that much applied math and computing, or even just part of the computing, may be much more valuable for solving other problems!

wavesounds
You really wrote the algorithm for FedEx? That's amazing!

I read your whole post, but I'm not sure exactly what you don't like about the class, other than you seem to think learning Discrete Optimization in general is a waste of time because there are more lucrative problems that are much easier to solve.

Care to briefly elaborate on what you dislike about this course specifically (it does appear that Simplex is covered)?

graycat
Part I

I wrote quickly and was not very clear for people without good backgrounds in optimization.

> You really wrote the algorithm for FedEx? That's amazing!

In the sense of business, it actually was "amazing": Likely it saved the company from going out of business. Too many Members of the Board of Directors were concerned that there could be no suitable schedule. So, one evening Roger Frock and I used my software to develop a schedule for all 33 planned airplanes and all 90 planned US cities.

With the schedule printed out, FedEx founder, COB, and CEO F. Smith said at the next senior staff meeting, "Amazing document. Solved the most important problem facing Federal Express."

Two guys representing Board Member General Dynamics went over the schedule and announced "It's a little tight in a few places, but it's flyable.". Then the Board was happy. As I recall, $55 million in funding was enabled, not all equity.

So, the software was "amazing" just as business. But as applied math, the software was junk! Work for the real, pressing, short term business problem? Yes. Really optimization? No!

But I did continue and saw that what I should do was (1) generate some thousands of 'feasible' trips from Memphis and back, and for each carefully add up the direct costs, and (2) set up a 0-1 integer linear programming problem with one row for each of the 90 cities and one column for each of the thousands of feasible trips. Then each row says to serve some one city. Each variable, 1 or 0, one for each column, says to use that trip or not. Right: It's 0-1 integer linear programming set covering, yes, NP-complete.

So, that's a linear program with only 90 rows. Just as a linear program, that's promising, likely easy.
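
Written out, that set-covering model is (standard notation, with the symbols made up here): let x_j = 1 if trip j is flown, c_j be its carefully added-up direct cost, and a_ij = 1 if trip j serves city i. Then

    \min \sum_j c_j x_j
    \quad \text{s.t.} \quad \sum_j a_{ij} x_j \ge 1 \ \text{ for each of the 90 cities } i,
    \qquad x_j \in \{0, 1\}.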

Make money? My only competition was a guy with a map of the US on a sheet of 8 1/2 x 11" paper. He didn't have a reasonable way even to add the costs of a candidate schedule or print it out. So, yes, my work should have saved a bundle.

If nothing else, just solve the LP without the 0-1 constraints (with only 90 rows, that should be easy), look at the results, and see if there is an easy, not very expensive way to patch up the fractions to 0-1 by hand. Else do a little branch and bound. If attacking the whole problem for the whole country is too difficult, then attack it for parts of the country separately; e.g., what you do in the East has next to nothing to do with what you do in the West. The goal is not to announce that you have the 'optimal' solution down to the last fraction of the last penny; instead, the goal is to save money. Don't be embarrassed about not guaranteeing to save the last $1000 or the last $1,000,000 a month; instead be horrified about not saving the first $10 million a month.

Also do a little duality theory to get a bound on how much more money might be saved. When you have saved the first $10 million and have no more than another $1,000 to save, take the best feasible solution so far and quit.

About then I'd gotten on an airplane and run off to chat about optimization with G. Nemhauser, then still at Cornell, J. Elzinga, then still at Hopkins, and J. Pierce, then in Cincinnati, got a stack of books and papers, etc.

There was also another problem: Our fuel prices and availabilities varied widely by airport. So, a question was, how much fuel to buy at each stop? So, buy extra fuel where it is cheap and available. This is called 'fuel tankering'. But, yes, will burn off some of the extra fuel carrying it.

How much extra fuel is burned off is fairly sensitive to the vertical flight plan used; so also need to select vertical flight plan in a coordinated way. The problem is also complicated by the fact that package pickup loads are not known until the plane arrives for the pickup, and those loads will affect the fuel burned for each candidate vertical flight plan. Then those loads will also affect how much fuel can be carried. Also relevant is that the fuel burned and flight time for a vertical flight plan and a load are affected by essentially random winds, air traffic control, and flying around summer thunder storms. Also really get charged not just for fuel but also time on the engines. So, it was complicated.

Now, try to find a way in practice to advise the pilot on what to do? So, right, it's another optimization problem -- partly discrete, nonlinear, sequential, stochastic. So, right, stochastic dynamic programming. Doable? Actually, yes, quite doable. On a computer today, could solve the problem for, say, five stops, with weight discretized at 100 pounds in about the time it takes to get a finger off the Enter key or the mouse button.
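
To make the fuel-tankering idea concrete, here is a heavily simplified, deterministic sketch in Python (the real problem is stochastic and weight-dependent as described above; every number below is made up purely for illustration):

    from math import inf

    prices = [3.2, 5.0, 4.1, 2.8, 4.6]   # $ per fuel unit at each of 5 stops
    base_burn = [40, 55, 35, 60]         # fuel burned on each of the 4 legs with a light tank
    carry_penalty = 0.10                 # extra fuel burned per unit of fuel carried on a leg
    capacity = 200                       # tank capacity, in discretized fuel units
    reserve = 10                         # minimum fuel required on arrival

    levels = range(capacity + 1)

    # best[f] = cheapest cost of the remaining legs, arriving at the current stop with f units.
    best = [0.0] * (capacity + 1)        # after the final stop there is nothing left to buy

    for stop in reversed(range(len(base_burn))):   # stops that still have a leg ahead
        new_best = [inf] * (capacity + 1)
        for f in levels:
            for buy in levels:
                onboard = f + buy
                if onboard > capacity:
                    break
                # Carrying extra fuel burns extra fuel.
                arrive = onboard - base_burn[stop] - carry_penalty * onboard
                if arrive < reserve:
                    continue
                cost = prices[stop] * buy + best[int(arrive)]
                new_best[f] = min(new_best[f], cost)
        best = new_best

    print(best[50])   # cheapest fuel bill if we land at the first stop with 50 units on board

The real version replaces the fixed burn numbers with distributions over loads, winds, and flight plans, which is where the stochastic dynamic programming comes in.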

Also for the vertical flight plans, I went to MIT and chatted with M. Athans about deterministic optimal control theory.

graycat
Part II

There had been another place I'd saved the company from going out of business: Our two representatives from Board Member General Dynamics (GD) had packed their bags and were on their way back to Texas, which would have killed the company, when Roger Frock gave me a call and I went to the Board Meeting and explained some revenue projections I'd done with M. Basch. The GD guys were happy; the GD check was good; and the company was saved again.

But my offer letter promised founder's stock, and so far I had no stock. My wife was still in her Ph.D. program at Hopkins. Our home was still in Maryland, and I was flying jump seat home each few weeks. Also my computer access, PL/I on VM/CMS, no doubt by a wide margin the best computing then available for such work, was good in Maryland but sucked in Memphis so that for the software I had to be in Maryland which torqued one guy (not Smith) in Memphis. Also, Smith was not really happy about it.

I wanted a piece of paper, stock or Ph.D. On my last day Smith said "You know if you stay you are in line for $500,000 in Federal Express stock.". He wasn't putting that in writing; before I joined I was told by an SVP that I'd get the stock in "two weeks", and that was already too optimistic by over a year; I didn't know how serious Smith was; I didn't know if the Board would go along; I wasn't sure how much software I'd have to write for the optimization I had in mind or how patient Smith would be as I wrote the software and tried this and that in the optimization; and I was not sure how happy Smith would be about the likely considerable computer charges I'd run up.

But there was money to be saved: I'd written the first version of the software totally ASAP, fingers flying over the keyboard. There were some simple tweaks that could have helped save a lot of money, likely right away enough to pay for the computing I needed for the optimization. And in the optimization work, some early results, e.g., just the careful cost calculations, could have saved much more money than I needed for the optimization. And I believe that there was a fairly easy way to do the fuel buying problem to get it saving money quickly. The money to be saved just from my typing in some software was astounding. Actually, from what I learned later in graduate school, the optimization should have been not too difficult and saved a bundle, enough to make a major difference in the bottom line of FedEx.

But Smith wasn't putting the stock in writing, and my wife was in Maryland. So, I left and got a Ph.D. in optimization.

> I'm not sure exactly what you don't like about the class

I watched the preview lecture.

(1) The emphasis on the knapsack problem is misleading for practice -- really mostly contrived. For practical problems that really are knapsack problems, the technical fact that the problem is NP-complete is not very important; among the NP-complete problems, in practice knapsack problems are among the easiest to solve; the usual recommended approach is via dynamic programming. The claim that knapsack problems encounter exponential running time is over-emphasized to the point of being unrealistic.

The professor was over hyping the material in ways that are misleading. Bummer.

This stuff about NP-completeness is too often used in ways that are totally misleading in practice. Basically some professors are 'bloviating', trying to impress people with how difficult their work is.

Such hype can be seen as an attempt to intimidate others, and one cost can be that others get resentful and just decline to get involved with optimization. Related is the long, common emphasis on 'optimal' as if saving the last penny was some high moral objective worth much more than one penny; that emphasis was, again, a way to intimidate others and, thus, cause optimization projects to be neglected. The OP is falling into those old traps. Bummer.

Such nonsense goes back to the cartoon early in

Michael R. Garey and David S. Johnson, 'Computers and Intractability: A Guide to the Theory of NP-Completeness', ISBN 0-7167-1045-5, W. H. Freeman, San Francisco, 1979.

where the optimization guy says to the business manager that he (the optimization guy) can't solve the manager's problem but neither can a long line of other optimization experts. Nonsense, 99 44/100% total, made up, flim-flam, fraud nonsense. Why? The business manager likely cared essentially only about saving the first 90% of the cost savings from an 'optimal' solution, nearly always in practice, and for the rest was quite willing to f'get about it; what he wanted was likely quite doable; and nearly all the difficulty the optimization guy was talking about was for the parts the business manager was willing to f'get about. Really, the optimization guy was not looking to solve the business manager's problem but looking for a lifetime job pursuing academic prestige at the business manager's expense. The OPs emphasis on the difficulty of his work is coming way too close to this old mistake.

E.g., at a start up in Texas, I mentioned, as in my first post in this thread, I'd gotten a feasible solution within 0.025% of optimality for a 0-1 integer linear program with 40,000 constraints and 600,000 variables in 905 seconds on a 90 MHz computer. Then the group of people I was talking to, heavily from SMU, flatly refused to believe my statement; they were convinced that due to NP-completeness theory I had to be lying. I was telling the exact truth, and NP-completeness theory in no way contradicts what I said. NP-completeness theory is about exact optimality, down to the last tiny fraction of the last penny for worst case problems, the worst case that can exist even in principle, and that context is a very long way from using optimization to save money in practice. Sure, it might be super nice and valuable to have a fast, low degree polynomial algorithm that shows that P = NP, but lack of such an algorithm does not say that our optimization problems are too difficult in practice, especially if all we want to do is save millions of dollars and are willing to sacrifice the last 10% of the savings.

I remember when I was at FedEx and thinking of going to Brown for my Ph.D. I visited the campus and ate lunch with two professors, one who was eager for me to come and the other just the author of a text I'd long since read carefully. When I was asked what I was doing at FedEx and explained the fleet scheduling, the text author responded with contempt "the traveling salesman problem", as if the work was hopeless. No, not in any very meaningful sense. The goal was to save money, and that was quite doable, NP-completeness theory or not. That he wanted to use some tricky point about NP-completeness theory as an excuse not to save a significant fraction of the FedEx costs, millions a year, was a major factor in my not going to Brown. We have to wonder how that professor even tried to get from home to lunch that day, since he believed that to do so he had to solve the traveling salesman problem.

The OP's emphasis on NP-completeness to claim how difficult were the problems he was solving was nearly as objectionable. He was being misleading. Bummer.

Again, nearly always (sure, if the problem is SAT, then an approximate solution may be of no interest) the goal in practice is to save money; the difficulty of saving the last penny, always, worst case, guaranteed, is no reason not to save the millions that can be saved in nearly all practical cases for likely quite reasonable effort and possibly some astounding ROI.

Net, the NP-completeness theory is far too often used to claim that such optimization is "hard", but for saving a lot of money in practice that's often just wildly false.

Indeed, as I mentioned in my first post in this thread, we are not afraid to use algorithms that are worst case exponential because simplex is worst case exponential. To show just how far from reality NP-completeness theory is, as I mentioned, on average in practice simplex is low order polynomial.

(2) The claim by the OP that if can solve one NP-complete problem with a 'good' algorithm, then can solve them all is, sure, true in principle and nice to know but not very important in practice and nothing to emphasize in that introductory video. Here the OP was hyping his work in a misleading way. Bummer.

(3) The OP's claims that optimization is a big deal in practice are hype and misleading. Bummer.

The problem with optimization playing a big role in practice was illustrated there at FedEx: Smith had some huge reasons to have me pursue the optimization. He didn't support my work nearly well enough, and the main reason was that he just didn't have faith that he should make that part of his company the work of some technical experts doing work he didn't understand (read that statement several more times and fill in with what we can expect from emotions, ego, sense of control, Smith's pride in the paper he wrote at Yale, possibly some resentment for academics, his family fortune he'd invested, his long time associates he'd wanted to count on, promises he'd made to various people, his image before the 'suits' on his Board, etc.). Law and medicine have such professional respect; optimization does not.

graycat
Part III

In the end, it's super important to be the guy who OWNS a business and SELLS the results. E.g., for optimization, maybe develop the software for free, show the results and the savings, and then ask to get paid a fraction of the savings. Let's see, long ago one commercial airline was spending $89 million a month on jet fuel. I can believe $200 million a month now, also for FedEx. From a fast Google, get to

     http://www.transtats.bts.gov/fuel.asp
with data for US airlines

     2013 April 847.5 2,432.4
or 847.5 million gallons costing $2,432.4 million dollars in April, 2013.

Save 15% of $200 million a month and get paid 20% of the savings and get paid $6 million a month, from just one customer. And it's an easy sale: Take case A, what they are doing now, and cost it out. Then take case B, from optimization. Then compare costs. Simple. Compelling. Maybe not still compelling now, but would have been for much of the history of FedEx.

But, for my doing an in-house effort, Smith didn't take the work very seriously. Right, in the next year I might have burned off $50,000 in VM/CMS time sharing computer charges. Right. But jet fuel is expensive, too.

> it does appear that Simplex is covered

Yes, of course, the course will have to, but the introductory lecture didn't emphasize that.

In a sense, simplex is dirt simple -- just elementary row operations very much like in Gaussian elimination but, using the 'reduced costs', selected to make money at each iteration in, if you wish, a 'greedy' way. But there's more, e.g., some surprising points about moving along edges of a closed convex set, from an intersection of finitely many closed half spaces, from extreme point to extreme point. For the course, discrete and combinatorial optimization, you really should know simplex quite well; it promises to be the core of the course. Also, again, simplex is worst case exponential!

For the career prospects of the course, only a tiny fraction of college courses have good, immediate, direct career contributions. So, I can't be offended that the OP's course does not have really good career contributions. But I am offended that the OP tried to claim that his optimization was so important that there would be good career prospects. Sure, Bixby (of CPLEX) bought a nice house in Houston, but mostly I'm still looking for the yachts, or even the nice houses, of optimization experts. Heck, even job ads would be reassuring.

I have one of the best Ph.D. degrees in optimization, and it has been essentially useless for my career. The core reason is, the business guys with the projects and budgets don't understand optimization, have no respect for it, and don't want to bet part of their careers on it. There's usually little or no downside for ignoring optimization. For pushing a project in optimization and failing, there's a lot of downside. For being successful, there will likely be resentment, attacks from other managers who feel threatened, etc. and otherwise no great upside. So optimization projects are about as popular as a skunk at a garden party.

One final war story: Long the dean of engineering at MIT was T. Magnanti, an expert in optimization. Once he gave a Goldman Lecture at Hopkins on optimization of the design of large IP networks. From some old Bell Labs data (from some work likely close to the book with the cartoon), optimization should be able to save ballpark 15% of network capital expense; worthwhile money if you can get it.

So, at one time there was a start up in Plano, TX attacking this problem. At the time, so was I. So, the TX people flew me down for an interview. They had some venture funding, and it may be that some of the people who met me were the venture partners. The company's main optimization guy from SMU had just bowed out. The CEO was a former IBM guy, and they flew me down partly because of my role at FedEx and also because I'd been at IBM's Watson lab.

So, I arrive. I'm met at the door by the CEO, the IBM guy. Right away he scowls, and I never see him again. Why? Because I didn't know his name (I'd had no communications with him), and my handshake was not impressive enough. He desperately needed some good expertise in optimization, e.g., in a back room he had a high end PC with a copy of CPLEX gathering dust while his people were trying total enumeration, but he wanted nothing to do with me. I'm not that hard to take, not even while tired from a plane trip and driving from the Dallas airport to Plano.

The point was, he was convinced that his IBM white shirt and IBM salesman handshake were what were crucial for his company and that my background in optimization was, well, whatever but likely not really good like a handshake. He had no respect for optimization. Soon he was 'promoted' to just a Board slot.

My background in optimization? Did I mention Goldman? He was the Chair of the committee that approved my Ph.D. research. On the committee was C. ReVelle, world expert in optimization for facility location (mentioned by the OP). Also on the committee was J. Cohon, world expert in multi-objective optimization and long President at CMU. My research was from a suggestion in three words by G. Nemhauser, world expert in optimization. One paper I'd published solved some long outstanding questions in the Kuhn-Tucker conditions in optimization and solved a problem stated in the famous paper by Arrow, Hurwicz, and Uzawa. I went through one of the best Ph.D. programs in optimization on the planet. Still, the CEO in TX wanted nothing to do with me. And, really, after my Ph.D., neither did FedEx.

When I left FedEx, I'd saved the company twice and for more had identified, formulated, done good first-cut progress on, and presented the three optimization problems, fleet scheduling, fuel buying, and vertical flight planning. All I needed was a little consulting money and a good VM/CMS time sharing account from my home in Maryland, but that was not enough to get Smith impressed. So, I lost, and so did Smith and FedEx.

In business, optimization is a Rodney Dangerfield field -- it "don't get no respect". So, if you exploit optimization, then do it for your own company or sell just the results based on the savings obtained: Since the 'suits' are convinced that optimization can't save much, when the contract is signed they will believe that they won't have to pay much. When they have to pay $6 million a month, they will be surprised and pleased by how much they are saving but bitter and furious at the $6 million a month they are signed up to pay.

There is a secret in business: To get paid well without too much resentment, jealousy, etc. from others, get paid in ways where others don't really know how much you are making. So, if some big company has to pay you $6 million a month, even if you are saving them $30 million a month, they likely will be torqued; but if you can get the $6 million a month from several ad networks from running many millions of ads, then no one will get torqued.

amhey
graycat - remember this is a 7-week course and the lecturer, Pascal Van Hentenryck, in the initial video is stirring up interest imitating Indiana Jones - he's on a quest, he's passionate. So it's not the Hillier and Lieberman textbook approach to Operations Research of old. The whole point was to make the difficult problem of combinatorial problem solving fun and attractive to a wide audience.

I agree with you that many managers haven't a clue what to do with an OR graduate. So it does depend on how you position yourself. One reason for taking the course is to be able to promote your skills passionately (try making a video of your skills in everyday parlance, like the introductory video to this lecture) using current terminology - which, as Prof Van Hentenryck says, are in demand at INFORMS conferences (go to the analytics conference rather than the main conference, which is more academic).

I agree with you that many execs haven't a clue what optimization can do - or care - in fact most managers satisfice. That's why it's important to find the right boss who does value your accomplishments. It's also important to update how you state your value to others - which is one reason I'm doing the course. I hope you stick with it - and I'd be interested to know what you think at the end of it.

Is the point of your argument that the leaderboard ranking forces people to try for an optimal solution, when in a business situation a solution at 99% of the value might be good enough?

graycat
The Indiana Jones take off is fine. Mentioning the knapsack problem is less good because it's not so important in practice. Saying that the knapsack problem is difficult to solve, e.g., encounters exponential algorithms because technically it's in NP-complete, is next to irrelevant for practice, misleading, and hype and not fine.

> it's also important to update how you state your value to others

On this, I outlined my suggestion: Own a little company and sell results based on how much money they save the customer. Make the sale about saving the customers money in ways that even an auditor can confirm are correct.

INFORMS is clearly an echo chamber, people in optimization looking for work and talking to themselves.

Broadly for optimization in business, there is a very serious problem: Optimization is not a 'profession' like law, medicine, or major parts of engineering. So, there is no licensing, certification such as the CPA, peer-review of practice, legal liability, etc. So, as I said, the field "don't get no respect". Also missing is a point the legal profession has: Any working lawyer must report only to a lawyer; the interface between the optimization guy and the business guy is nearly impossible.

> Is the point of your argument ...

I tried to make several points. One of the points was about 'optimal'. The mathematical definition is fine, but long that definition was taken as suggesting that what we should do in practice is look for such solutions, then strain to find them, etc. That turned out to be a grand mistake.

Why? Because maybe there is, compared with what the customer is doing now, $10 million to be saved with an optimal solution. But too commonly saving all $10 million is too difficult for the algorithms and computing. So, straining to save all the $10 million converts an important business problem into a much more difficult mathematical problem. It also turns out that commonly it's fairly easy to save, say, $9 million. The difficulty is saving the last of the money, and the most difficult money to save is the last, say, the last 10 cents.

'Optimal' was taken as a moral objective, as I said, as if saving the last 10 cents was worth much more than 10 cents.

Struggling over 'optimal' taken literally and, thus, making real problems much more difficult than necessary, was several torpedoes below the waterline of the ship of optimization.

Part of this mistake was the simplistic and excessive emphasis on NP-completeness -- for real problems the whole P versus NP question is next to irrelevant. One way to see this is the simplex algorithm -- it's the core of optimization and astoundingly fast in practice but worst case exponential. There is a polynomial algorithm for linear programming, but it's way too slow in practice. In practice, that an algorithm is worst case exponential is commonly just irrelevant.

I had to conclude that for business, optimization is a dead field. It got started due to WWII and US DoD funding, and maybe in places there is still some interest for US DoD problems.

Here is a little: A post above, in response to a post of mine, claimed that IBM had a good optimization group. If so, then good for IBM. But I was at IBM's Watson lab, published a paper on optimization, and off and on considered joining the optimization group there. Phil Wolfe, William Pulleyblank, Ellis Johnson, and others were in that group. At one point, Roger Wets was visiting. The group did the IBM Optimization Subroutine Library (OSL). Then in 3 years near 1994, IBM lost $16 billion. Johnson joined George Nemhauser at Georgia Tech. Pulleyblank became a professor at West Point. Basically the optimization group fell apart. Maybe they put a group back together, but losing Johnson and Pulleyblank were big mistakes.

E.g., again, with Pulleyblank at West Point, the US DoD remains interested in optimization.

Heck, I supported myself and my wife through our Ph.D. degrees by working in optimization for the US DoD.

In academics, the professors were to do research to get the field going, e.g., research in 'systems analysis', 'mathematical sciences', 'civil engineering', 'production', etc. Yes, if optimization problems were easy to solve, then optimization would have central roles in those fields. Alas, mostly important practical optimization problems are not so easy to solve, even approximately. So, the professors are still doing research -- maybe in some decades or centuries they will have something of serious importance for those fields. I doubt that the research is very well supported.

I tried to give a summary of what is essentially a 'cultural contradiction' in expecting optimization to be a popular field in business: by the time computing is ready to make optimization easy enough, there are other things to do with the computing that make much more money than optimization.

It's not that optimization can't save money in business; there is money to be saved, and in a lot of stable businesses optimization can provide some of the highest ROI available to the business. So there is some ground available there, in principle fertile ground. So, there can be some optimization groups here and there. If the course prof has such a group in Australia, good for him. With some really impressive 'cases' published in, say, the INFORMS journals, maybe mainline business will try optimization again. I doubt it, but maybe. Don't hold your breath waiting; there are lots of impressive cases long since published in the INFORMS and ORSA journals, Management Science, etc. The optimization literature is huge, going back to the late 1940s, e.g., to Dantzig at RAND and Berkeley.

Here's a little on the difficulty: In the US there are college accrediting groups, and for some years they said that an undergraduate degree in business should have courses in optimization and statistics. So, for years each business school student, undergraduate or MBA, got a course in optimization. For some years, I taught such courses. Still the field didn't take off.

I can't recommend that anyone try to have a career in optimization in business. You stand to have an easier time supporting a family with a career as a plumber, literally. With software, do an information technology start up, sell out, and pocket, say, $10 million -- knocks the socks off optimization. With irony, if interested in 'optimization' of your career and financial security, then avoid optimization!

Optimization is like some item at Dunkin Donuts that doesn't sell. Lots of other stuff at Dunkin Donuts sells really well, but that one item just doesn't. They can do a good job getting the item ready to eat, put it out in the display cases, and wait, and what happens is the item just sits there and goes stale. Then they throw away the stale, unsold items. It was a waste. Finally, Dunkin Donuts just quits offering the item.

Dunkin Donuts doesn't go on and on about why the item really should sell. Instead, they just listen to the clear message they've gotten from the market and, thus, save having to figure out solid reasons it doesn't sell.

Similarly all across business -- some stuff doesn't sell or doesn't sell very well or sells only a little and then only into a tiny market. Optimization in business is like that -- at best, it's a super tough sale; usually it just doesn't sell.

Optimization, as a field, in business, is a dead duck. F'get about it and pursue something else.

tadworthington
Narcissistic babble.
graycat
It's not about me, not at all.

It's about the course and optimization as a field, especially in business.

So, I contributed.

Since I've been there, done that, got the T-shirt, I'm able to make some contributions few others can, but to make these contributions I have to draw on some of my personal experiences. It's not about me or "narcissistic".

And, it's not "babble": Instead, in some situations it's darned valuable information, that I very much wish I'd had long ago. With that information, I would have avoided trying to have a career in optimization. Much of my Ph.D. is in optimization; some of the rest of my Ph.D. may yet prove to be valuable, but the optimization part was essentially a waste. Yes, we expect some useless chaff in with the good wheat, but still we don't like it. Optimization cost me a lot.

Broadly, optimization is a siren song, especially for people with at least one foot in computing. Since in principle, and sometimes in practice, the field can save money, enough to give quite a high ROI, the song sounds good. Alas, mostly the song is not good but an invitation to disaster, to taking a career into a swamp.

I gave you some rare and good insight that could be quite valuable; ignore it if you wish.

"Experience is the great teacher, and some will learn from no other".

I learned about optimization from experience; it's the very expensive way to learn; if you throw away what I reported from my experience, then you get to take the expensive way, also.

noobakgn
There are local and constraint-based search techniques that are useful for particular situations as well.
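As a small illustration of the local-search side (a toy of my own choosing, not material from the course or the thread), here is plain 2-opt improvement on a random travelling-salesman instance: repeatedly reverse a segment of the tour whenever doing so shortens it, and stop when no improving move remains.

    # Toy local search: 2-opt improvement on a random TSP instance.
    import math
    import random

    random.seed(0)
    n = 60
    pts = [(random.random(), random.random()) for _ in range(n)]

    def dist(i, j):
        return math.dist(pts[i], pts[j])

    def tour_length(tour):
        return sum(dist(tour[k], tour[(k + 1) % n]) for k in range(n))

    tour = list(range(n))
    improved = True
    while improved:
        improved = False
        for i in range(n - 1):
            for j in range(i + 2, n):
                a, b = tour[i], tour[i + 1]
                c, d = tour[j], tour[(j + 1) % n]
                # Reversing tour[i+1 .. j] swaps edges (a,b),(c,d) for (a,c),(b,d).
                if dist(a, c) + dist(b, d) < dist(a, b) + dist(c, d) - 1e-12:
                    tour[i + 1 : j + 1] = reversed(tour[i + 1 : j + 1])
                    improved = True
    print("locally optimal tour length:", round(tour_length(tour), 3))

Local search like this gives no optimality guarantee, which is exactly the trade discussed above: a good solution quickly rather than the last few percent slowly.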
antman
Here is the instructor Pascal Van Hentenryck's answer from the Coursera forums:

"Thanks for pointing this out.

First, we obviously cover linear programming and mixed-integer programming in this course. These techniques are indeed very successful in practice and I hope that you will also enjoy these lectures. They make up a third of the class. The references

    Integer Programming by Laurence A. Wolsey
    Integer and Combinatorial Optimization by George L. Nemhauser, Laurence A. Wolsey
    Large Scale Linear and Integer Optimization: A Unified Approach by Richard Kipp Martin
    Introduction to Linear Optimization by Dimitris Bertsimas, John N. Tsitsiklis
    Understanding and Using Linear Programming by Jiri Matousek, Bernd Gärtner
    Theory of Linear and Integer Programming by Alexander Schrijver
are also provided in the additional material. The work mentioned in the introductory lectures includes many applications that are based on mixed-integer programming. So I am not sure where this person is coming from with her/his comment.

This being said, constraint programming and local search are also very successful in practice, often for different types of problems. So this class aims to give an introduction to the field through a subset of the techniques that are successful in practice (there are others, too). My research group uses all three of these techniques to solve large-scale problems in logistics and supply chains, energy, and disaster management. Some of the problems we solve require the three approaches together to achieve high-quality solutions. To paraphrase John von Neumann defending George Dantzig: "If this technique applies to your problem, use it; otherwise do not".

We also cover column generation and large neighborhood search in the advanced topics. I used to cover Lagrangian relaxation in my class at Brown and we may add a lecture on this. This class has almost all the material of my optimization class at Brown University, but not all, since it is shorter. We may still add an advanced lecture on this topic (I have an additional motivation now).

With respect to the job market, I would simply say: go to the INFORMS conference and see how many people from academia, industry, and government are recruiting. Companies such as Google, IBM, Amazon, ... have outstanding groups in optimization. Follow Mike Trick's blog and Twitter account to get a sense of how lively this community is (see the community link on the page). I never had a student not find a job, and most of them did not go to academia. One of them actually works for ... FedEx. My sense is that the opportunities for optimization keep growing and it is an exciting time for optimization. There are many startups, established companies, and large multi-national organizations all doing optimization. Some focus on solvers, some on vertical products, some on dedicated applications, and some on several or all of these aspects. The field has progressed significantly.

There is a lot of work, and a lot of progress, in making optimization more reusable, and the tools/solvers are now much better at being generic. Unless P=NP, this will always be a challenge, but there are more and more generic tools for solving classes of applications. There are also solvers that provide a lot more flexibility to build dedicated algorithms much more simply. Many people (including me) have spent their research and academic careers trying to design such tools, and some make a significant difference in practice.

Finally, optimization is a multi-disciplinary field. My group employs mathematicians, physicists, engineers, and computer scientists. What I personally find incredibly stimulating in this area right now is that sometimes you need a physicist, a mathematician, and a computer scientist together to solve a complex problem. We all look at a problem from different angles and there is tremendous value in that. I am part of the INFORMS, artificial intelligence, and computer science communities, and some of the scientific contributions I am most proud of are typically those where I was able to bridge two fields. Many of the techniques in optimization come from a variety of fields and this is also what makes it exciting.

And, yes, I am excited about the future of optimization.

I hope it helps."

graycat
If the course professor can have a flourishing, productive group in optimization in Australia, then good for him, his group, and the good judgment of his customers.

In the US, optimization was a field with lots of effort going back at least to Dantzig at RAND in the late 1940s. The main push for the field was just the US DoD.

For US business, there have been some niche applications, but the overall situation has long been just as I described -- the field "gets no respect". With rare exceptions, people just don't want it. Elsewhere in this thread I've given nearly exhaustive reports of why basically in US business optimization is a dead duck.

wavesounds
I love how animated and excited this guy is about his subject!
HN.Academy is an independent project and is not managed or owned by Y Combinator, Coursera, edX, or any of the universities and other institutions providing courses.