
I was wondering why, in my 30 years of programming experience, I've never seen a book that discusses the importance of balancing Run Speed vs. Design Speed vs. Reusability.

Reusability vs. Run Speed
I've recently read numerous books on C# and object-oriented code, all of which stress making code reusable. None of them mention that the end result might run so slowly your code will be useless.

The classic Design Patterns book by Gamma, Helm, Johnson, and Vlissides (GoF) gives an example (section 2.8) of a text editor named Lexi in which each character is put in a separate container, which makes it easy to program, but the end result would run so slowly the code would be useless.

In fact, I've never heard of Lexi. A brief search of the internet suggests it never actually existed. Why teach something that is great in theory but useless in practice?

Algorithm Speed vs. Run Speed
Knuth wrote 3 volumes analyzing algorithm "speed", e.g. how many algorithm "steps" it takes to sort an array, but I've never seen an analysis of algorithms that takes into account virtual memory paging, which can be an extremely important factor in Run Speed.

For Example:
This takes 2 seconds to run:
      int[, ,] a = new int[600, 600, 600];
      for (int i = 0; i < 600; i++)
        for (int j = 0; j < 600; j++)
          for (int k = 0; k < 600; k++)
            a[i, j, k] = 1;  //i,j,k order


This takes 14 seconds to run:
      int[, ,] a = new int[600, 600, 600];
      for (int i = 0; i < 600; i++)
        for (int j = 0; j < 600; j++)
          for (int k = 0; k < 600; k++)
            a[k, j, i] = 1;  //k,j,i order
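(For what it's worth, the same two access patterns written against a flat one-dimensional array make the difference in stride explicit. The sketch below is only an illustration, not part of the timed test; it relies on the fact that C# multidimensional arrays are stored row-major, with the last index varying fastest, so the i,j,k order walks memory sequentially while the k,j,i order jumps roughly 1.4 MB between consecutive writes.)

  using System;

  class StrideSketch
  {
    static void Main()
    {
      int n = 600;
      // One flat array with the same number of elements as int[600, 600, 600].
      // These index expressions mirror what a[i, j, k] compiles down to.
      int[] flat = new int[n * n * n];

      // i,j,k order: consecutive writes touch consecutive ints (stride of 4 bytes).
      for (int i = 0; i < n; i++)
        for (int j = 0; j < n; j++)
          for (int k = 0; k < n; k++)
            flat[i * n * n + j * n + k] = 1;

      // k,j,i order: consecutive writes jump n*n ints (about 1.4 MB) each time,
      // so almost every write lands on a different page and cache line.
      for (int i = 0; i < n; i++)
        for (int j = 0; j < n; j++)
          for (int k = 0; k < n; k++)
            flat[k * n * n + j * n + i] = 1;

      Console.WriteLine("done");
    }
  }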



Design Speed vs. Run Speed
I recall a company whose president insisted that all their programmers code exclusively in assembly language so their programs would run fast. Their programs did run very fast. However, it took the programmers much longer to write everything in assembly, which made their programs more expensive, and the company eventually closed.

So, are there any books that do take run speed into consideration when designing a program?


(P.S. Here's the full sample code:)

  using System;
  using System.Text;
  using System.Diagnostics;

  namespace LoopSpeed1
  {
    class Program
    {
      static void Main(string[] args)
      {
        int[, ,] a = new int[600, 600, 600];

        Stopwatch stopwatch = new Stopwatch();
        Console.WriteLine("Begin");
        stopwatch.Start();

        for (int i = 0; i < 600; i++)
          for (int j = 0; j < 600; j++)
            for (int k = 0; k < 600; k++)
              //a[i, j, k] = 1;  //i,j,k order
              a[k, j, i] = 1;    //k,j,i order

        stopwatch.Stop();
        TimeSpan ts = stopwatch.Elapsed;
        string elapsedTime = String.Format("{0:00}:{1:00}:{2:00}.{3:00}",
          ts.Hours, ts.Minutes, ts.Seconds,
          ts.Milliseconds / 10);
        Console.WriteLine(elapsedTime);
        Console.ReadLine();
      }
    }
  }


asked 11/18/2011 12:48

deleyd


3 Answers:
I'm a big fan of the 80/20 rule. It can be applied to almost anything, like this: get 80% of the speed improvement with 20% of the effort, and call it good enough. Code quickly, then go back and improve the areas that need it.
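A rough sketch of that workflow using the loops from this question (the method names are made up purely for illustration): write the quick version, measure it, and only spend rewrite effort where the measurement points.

  using System;
  using System.Diagnostics;

  class EightyTwenty
  {
    static void Main()
    {
      int[, ,] a = new int[600, 600, 600];

      // Step 1: write the quick, obvious version and measure it.
      Stopwatch sw = Stopwatch.StartNew();
      FillSimple(a);
      sw.Stop();
      Console.WriteLine("Simple version: " + sw.ElapsedMilliseconds + " ms");

      // Step 2: only because the measurement flagged this loop as the hot spot
      // do we spend extra design effort on a cache-friendly rewrite.
      sw.Restart();
      FillTuned(a);
      sw.Stop();
      Console.WriteLine("Tuned version:  " + sw.ElapsedMilliseconds + " ms");
    }

    // The quick-to-write version (k,j,i order, slow on this layout).
    static void FillSimple(int[, ,] a)
    {
      for (int i = 0; i < 600; i++)
        for (int j = 0; j < 600; j++)
          for (int k = 0; k < 600; k++)
            a[k, j, i] = 1;
    }

    // The tuned version (i,j,k order, walks memory sequentially).
    static void FillTuned(int[, ,] a)
    {
      for (int i = 0; i < 600; i++)
        for (int j = 0; j < 600; j++)
          for (int k = 0; k < 600; k++)
            a[i, j, k] = 1;
    }
  }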

answered


aarontomosky

Try searching academic papers instead of books. You should find a lot more on the subject. Google Scholar is a good starting point.
http://scholar.google.com/scholar?hl=en&q=programming+tradeoffs&btnG=Search&as_sdt=0%2C26&as_ylo=&as_vis=0

answered 2011-11-18 at 21:05:24


TommySzalapski

That's a great analysis of the different tradeoffs, deleyd, and I've not seen this discussed in books. In some sense, one of the main themes behind agile/XP has been to promote design speed. Practices like "don't build it until you need it" and "refactor often" all improve design speed and should be largely orthogonal to run speed.

Algorithm speed vs. run speed is a great observation, and as modern machines become much more complex "under the covers" (multiple processors, multiple execution paths within a processor, complex cache behaviors, and so on), it's much harder to say that an efficient algorithm design will necessarily lead to good run-time performance.
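As a small illustration of that (just a sketch; the numbers will vary a lot by machine): both traversals below touch every element exactly once, so a step-counting analysis treats them as identical, yet the shuffled visiting order defeats the cache and typically runs several times slower.

  using System;
  using System.Diagnostics;

  class CacheDemo
  {
    static void Main()
    {
      const int n = 20000000;           // 20 million ints, about 80 MB per array
      int[] data = new int[n];

      // Visit order 1: 0, 1, 2, ... (sequential, cache friendly).
      int[] sequential = new int[n];
      for (int i = 0; i < n; i++) sequential[i] = i;

      // Visit order 2: the same indices, randomly shuffled (cache hostile).
      int[] shuffled = (int[])sequential.Clone();
      Random rng = new Random(12345);
      for (int i = n - 1; i > 0; i--)
      {
        int j = rng.Next(i + 1);
        int tmp = shuffled[i]; shuffled[i] = shuffled[j]; shuffled[j] = tmp;
      }

      Console.WriteLine("Sequential order: " + TimeSum(data, sequential) + " ms");
      Console.WriteLine("Shuffled order:   " + TimeSum(data, shuffled) + " ms");
    }

    // Same "algorithm" either way: one addition per element, n steps total.
    static long TimeSum(int[] data, int[] order)
    {
      Stopwatch sw = Stopwatch.StartNew();
      long sum = 0;
      foreach (int index in order)
        sum += data[index];
      sw.Stop();
      return sw.ElapsedMilliseconds;
    }
  }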

Along the same lines as what you're raising, I think old adages like "avoid premature optimization" take skill to apply correctly. At some level, all software design is about achieving reasonable performance. You could build a program with virtually no state, computing everything from first principles whenever you need it (if you get into the Turing machine material you can really see this), but its performance would be spectacularly bad. So all engineers are engaged in the design speed vs. run speed tradeoff continuously, even if they don't realize it.
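To make the "no state" extreme concrete with a toy example (Fibonacci is only a stand-in here, nothing from any particular book): the first version recomputes everything from first principles on every call; the second keeps a small cache of state, trading a little design effort and memory for a dramatic run-speed win.

  using System;
  using System.Collections.Generic;

  class StateTradeoff
  {
    // No state at all: recomputes the same subproblems over and over
    // (an exponential number of calls as n grows).
    static long FibFromFirstPrinciples(int n)
    {
      return n < 2 ? n : FibFromFirstPrinciples(n - 1) + FibFromFirstPrinciples(n - 2);
    }

    // A little state: cache each result once; linear work overall.
    static Dictionary<int, long> cache = new Dictionary<int, long>();
    static long FibWithState(int n)
    {
      if (n < 2) return n;
      long value;
      if (cache.TryGetValue(n, out value)) return value;
      value = FibWithState(n - 1) + FibWithState(n - 2);
      cache[n] = value;
      return value;
    }

    static void Main()
    {
      Console.WriteLine(FibFromFirstPrinciples(40)); // takes a noticeable pause
      Console.WriteLine(FibWithState(40));           // effectively instant
    }
  }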

So very interesting points you raise, but I can't say I've seen much direct discussion of these in the literature.

Doug

answered 2011-11-19 at 05:58:04


dpearson
