A series of challenging mathematical/computer programming problems that will require more than just mathematical insight to solve. Although mathematics will help you arrive at elegant and efficient methods, a computer and programming skills will be required to solve most problems.

It has been a lot of fun to code these up in my language du jour, Python. There are a couple of problems that Python's built-in libraries have made trivial. I have to admit the most enjoyable part for me is having problems that demand algorithmic efficiency. For the simpler problems, I usually just quickly hack together the "naive" brute-force method, figure out that it doesn't scale, and then start investigating how I can fix it. Doing this, you exercise your mathematics, computer science, and programming skills, something that a lot of programming doesn't do.

I convinced my girlfriend to work with me on one of the exercises, and of course she picked one of the prime factorization problems. The naive brute-force algorithm was not an option for the large composite number given, so we ended up hacking together a Sieve of Eratosthenes. Ultimately, we got a version working, but it was still pretty inefficient, only returning the answer in about an hour; an optimal version should be able to do it within seconds. Obviously there is some "refactoring" to do.
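For flavour, a sieve-based factorization along these lines might look like the following sketch (this is not our actual code, and the sample number 13195 is just an illustration): generate primes up to √n with the sieve, then trial-divide.

```python
def sieve(limit):
    """Sieve of Eratosthenes: return a list of all primes <= limit."""
    is_prime = [True] * (limit + 1)
    is_prime[0:2] = [False, False]  # 0 and 1 are not prime
    for p in range(2, int(limit ** 0.5) + 1):
        if is_prime[p]:
            # Mark every multiple of p (starting at p*p) as composite.
            for multiple in range(p * p, limit + 1, p):
                is_prime[multiple] = False
    return [p for p, prime in enumerate(is_prime) if prime]

def largest_prime_factor(n):
    """Trial-divide n by sieved primes up to sqrt(n).

    Any leftover factor greater than 1 must itself be prime.
    """
    largest = 1
    for p in sieve(int(n ** 0.5) + 1):
        while n % p == 0:
            largest = p
            n //= p
    return n if n > 1 else largest

# Example: 13195 = 5 * 7 * 13 * 29
print(largest_prime_factor(13195))  # → 29
```

Sieving only up to √n is what makes this scale: a composite n can have at most one prime factor larger than √n, and it is whatever remains after dividing out all the smaller ones.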
Still, the C++ backend wasn't fully supported, required installing extra libraries, and was complicated. Not 100% what I needed or wanted. All of which got me thinking about domain-specific languages. Most programmers don't think of them this way, but SQL and regular expressions are good examples of domain-specific languages (DSLs), as are lex and yacc/bison. Until now, I've frowned on the whole idea of DSLs in general. It had always seemed like bad software engineering practice to invent a new language for each problem. After all, do we really want to learn an entirely new programming language with each assignment? Who is going to maintain the code? In practice, though, you have to learn an entire API anyway, and the API is really just a layer over what you're trying to do, in a language that wasn't quite expressive enough to do the job natively in the first place. Which of course leads me to Lisp, by way of Martin Fowler, who makes some good points here:
"One of the most obviously DSLy parts of the world is the Unix tradition of writing little languages. These are external DSL systems, that typically use Unix's built in tools to help with translation. While at university I played a little with lex and yacc - similar tools are a regular part of the Unix tool-chain. These tools make it easy to write parsers and generate code (often in C) for little languages. Awk is a good example of this kind of mini-language."

While I've been using SQL, regular expressions, awk, lex, and yacc for years, I'd never really classified them in my mind as DSLs. I've been well aware of the power of small specialized utilities aggregated together to perform a bigger task, and of why UNIX has been so successful at this, but I hadn't made the leap to apply the idea to my own programming. Fowler continues:
"Lisp is probably the strongest example of expressing DSLs directly in the language itself. Symbolic processing is embedded into the name as well as practice of lispers. Doing this is helped by the facilities of lisp - minimalist syntax, closures, and macros present a heady cocktail of DSL tooling. Paul Graham writes a lot about this style of development. Smalltalk also has a strong tradition of this style of development."

I've heard "grey-beards" and academics talk about the power of Lisp for years, and though I did some trivial functional programming in college, I'd dismissed the enthusiasm of the Lisp guys as nothing more than ranting. Today, though, the ideas are crystallizing in my head, and I'm excited to explore this more.
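To make the internal-DSL idea concrete, here is a toy sketch in Python (entirely hypothetical, not from Fowler): a chainable query API whose calls read almost like a little SQL, even though it's all ordinary Python underneath. The "language" is just an API shaped to match the problem domain.

```python
class Query:
    """A toy internal DSL for filtering lists of dicts."""

    def __init__(self, rows):
        self.rows = list(rows)

    def where(self, **conditions):
        # Keep only rows matching every key=value condition.
        self.rows = [r for r in self.rows
                     if all(r.get(k) == v for k, v in conditions.items())]
        return self  # returning self enables method chaining

    def order_by(self, key):
        self.rows = sorted(self.rows, key=lambda r: r[key])
        return self

    def select(self, *fields):
        # Terminal operation: project out the requested fields.
        return [tuple(r[f] for f in fields) for r in self.rows]

people = [{"name": "ada", "lang": "lisp", "age": 36},
          {"name": "bob", "lang": "lisp", "age": 29},
          {"name": "eve", "lang": "python", "age": 31}]

# Reads like: SELECT name FROM people WHERE lang = 'lisp' ORDER BY age
names = Query(people).where(lang="lisp").order_by("age").select("name")
print(names)  # → [('bob',), ('ada',)]
```

The point isn't that this is a good query engine; it's that nothing beyond plain classes and method chaining was needed to make client code read like the domain, which is exactly the territory where Lisp's macros go much further.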
I love snarky bug reports for some reason. It cracks me up that it took Sun eight years to add password prompting to Java, and watching the users grow increasingly irate in the bug report comments is awesome. I wish the programmers had responded back in a big flame war. I can only imagine what they were saying inside Sun. Good stuff.
This is intended as a beginner's tutorial for learning Haskell from a "Let's just solve things already!" point of view. The examples should help give a flavour of the beauty and expressiveness of Haskell programming.
"The Java language does not offer any way to explicitly allocate an object on the stack, but this fact doesn't prevent JVMs from still using stack allocation where appropriate. JVMs can use a technique called escape analysis, by which they can tell that certain objects remain confined to a single thread for their entire lifetime, and that lifetime is bounded by the lifetime of a given stack frame. Such objects can be safely allocated on the stack instead of the heap. Even better, for small objects, the JVM can optimize away the allocation entirely and simply hoist the object's fields into registers."