Leica DOS rules, BABY!

Brian

Product of the Fifties
I've been "repurposing" the Celeron 2.4GHz XP machine that I bought for Nikki almost 10 years ago to play games. She's moved on, and I still write software for DOS. WIN98 is the last OS from Microsoft that can be booted into real mode, which is perfect for doing embedded software. I've used older machines (Pentium Pro) for this for years.

So, after a week of learning tricks: use an 8GByte boot partition, or else the computer turns itself off during installation.
Limit WIN98 to 768MBytes in system.ini; limit the cache to 256MBytes in order to run DOS boxes and some software. You may have to play with these; I did a lot of trial and error to max it out. I have it working with 768MBytes now.
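For anyone trying the same thing, these caps live in system.ini; a minimal sketch, with the commonly cited values for a 768MByte memory limit and a 256MByte cache (treat them as starting points, not gospel):

```ini
[386Enh]
; Cap the physical memory WIN98 will use. MaxPhysPage is a hex page
; number; pages are 4KBytes, so 30000h pages * 4KBytes = 768MBytes.
MaxPhysPage=30000

[vcache]
; Cap the disk cache. Values are in KBytes: 262144KB = 256MBytes.
MaxFileCache=262144
MinFileCache=65536
```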

The result: a computer that runs DOS using 1.2GBytes and Windows using 512MBytes, and that does not turn itself off. I just wrote a DOS program that uses 800MBytes of memory. In FORTRAN, baby! I hit the 640KByte barrier in 1988 and blasted right through it using the Phar Lap DOS extender and Microway compilers.

The array sizes are limited to 64MBytes per array, but that is big enough to load the 36MByte files from the M9 and M Monochrom. I used to write a lot of my own image processing code.

Why DOS? My code rewrites the interrupt vector table and takes over the counter-timer registers. OSs usually don't like that.

DOS RULES, Baby!
 
DOS, or rather operating systems like DOS, are still commonplace. I worked several years ago on a test system used for testing control systems on trains; it was a DOS-like RTOS. Easy to write the code, a nightmare to debug it when it doesn't work. Still, I love real time, as there is a certain smug happiness in showing the young guns how to make stuff work using the ole black magic learned from years of figuring out how to make this stuff work.
 
I get a lot of "How did you do that?" I have a couple of younger engineers who want to learn this stuff, so I've been looking to get the tools running under "old computers" rather than "ancient computers". The alternative: I use a PC104 stack at work, but that's more expensive than picking up a Pentium 4.
 
Just hit the Needham Electronics web page!


I guess they are back? They were offline for a while; I heard they went out of business. Device burners from Digikey were out of stock when I checked a few years ago.

I switched to Dataman at work after not finding a replacement for the EMP-30.
 
I have an extensive set of DOS programs on my dalethorn site. The toolkits, mostly in QB4 (similar to ANSI 'C', which may sound peculiar until you see the examples), regard each compiled module as a command-line object callable from any other code, and which can call any other code. Does not run on Win7-64 unless using WinXP mode VPC etc.

There's a text indexer, a Btrieve browser with utilities, an encryption program offering 15k to any institution that wants to try cracking it in a chosen-plaintext attack, and a master toolkit that does a lot of stuff. The conversions from IEEE floats to MBF and back to IEEE are in plain BASIC and run quite a bit slower than routines written in assembler, yet at 250,000 to a million conversions per second, they are fast enough in BASIC to use in business software (where they are used).
 
I checked out the site, always ready to download free sourcecode!

I used CBasic 2 on CP/M, 30+ years ago. Elements of C mixed with BASIC; from what I've read of Visual Basic, it looks similar. It produced "P-Code" and had a low-level interpreter. But once I got Microsoft FORTRAN-80 for CP/M, I did not use it anymore. I even wrote my own graphics package that drove a DEC VT-241. I switched over to DOS and the Intel line because of the math coprocessor. I have stuck with FORTRAN plus assembler, even for the system-level programming. A by-product of a DEC VAX/VMS background: FORTRAN programs that did their own page management; they had to, because I set the priority above the pager and swapper. Once Phar Lap came out with their VMMDRV, I ported all of that to the PC. I mapped the graphics memory to a FORTRAN common block, and disk files to arrays for fast read-in. This was almost 25 years ago. The code still works beautifully; the processors have just gotten faster.
 
I hear ya! Timing is amazing in this business. HP issued Fortran as their desktop language starting with the desktop workstations in 1966, but sometime in the 1970s - late '70s maybe - they switched the default to their own high-powered BASIC interpreter, which I adopted by 1980. I then wrote parsers to convert code through the several HP BASIC flavors and DOS flavors, and eventually to ANSI C by 1989. HP's grand move to UNIX (HP-UX) had them touting Pascal for serious development, which I avoided, thank goodness. Not so some of the early NDT / 3D scanner programmers who were moonlighting at our shop in the mid-'80s. I always wonder what became of that Pascal code. In hindsight, Fortran would have given me a steady path through all of that to being able to compile for 64-bit systems today. I never had a problem composing several versions of my include (BASIC) or header (C) files for different O/S's, so I was always ready to adapt to the latest system. But, except for a few very simple UIs I did to get VB code running, I never wanted to get bogged down in that stuff. Today people pull up those C# modules floating around the Web and paste them together into 'Apps' sold by Apple etc. Fascinating stuff, but these guys are so far away from controlling much of what their code actually does under those GUIs - it gives me the chills.
 
Something I wrote for a friend who tried to inflict the 10 commandments of safe programming on me, sometime around 1990.

I have a short clip of someone doing the "To be or not to be" speech in actual Klingon. If I could only get a full length video of Hamlet in the original Klingon (as they say) then I'd be a very happy camper. BTW, I wonder what the Klingon version of coffee and pastries is.
 
Fortran, yikes! Assembler, still yikes. I only had about six months' worth of Assembler back at PolyTech; I can barely read it now. I took some classes on really oddball operating systems and programming languages.


I started out with Turbo Pascal and Borland C, then graduated to Delphi & Clipper. These days it's all MS Visual Studio 2010.

I rarely need to mess with operating systems ;)
 
I remember in school, 30+ years ago, one of the teachers in the last quarter of senior year was giving his outlook on job prospects to the class. He stated that people who just wanted to be "coders" were going to have a rough time given all of the new automated tools coming out, and that people who understood the low-level stuff would be in demand. He turned to me and announced in front of the class, "You'll always have a job."
 
People knowledgeable about Fortran and Cobol certainly were in big demand when the Millennium Bug was looming large. I was told there was big money to be made in it back then. I graduated in the summer of 2000 though, a little too late ;)

All those hours with structural analysis and structural design have gone to waste on me too.
Apparently my department is using EXTREMELY rapid prototyping....all the time.... ;)
 
Cobol is still in strong demand because it has been heavily used by the big banks, and they just have far too much legacy code to ever convert it all into something more modern (safely). Fortran I am not too sure about; it's where I started, as my academic education was physics, which led to computational physics research and then signal processing research. It's only really scientists that use it these days, mixed in with the occasional legacy monstrosity that nobody understands. Nearly all the low-level code or underlying processing code I have been involved with over the past decade and a half has been in C/C++ or Ada, although I never got round to gaining any proficiency in Ada, which is a shame as it's having a renaissance of late.

I did waste some time once learning Lisp, but it's another one of those good-idea dead-end languages.

Mr Flibble, it sounds like your department has got into XP; just make sure they don't use it as an excuse to "hack, slash and dash", as my friends have called it. Loads of places seem to have used these trends or fashions as an excuse to avoid doing a professional job.
 

Interestingly enough, I never worked directly with assembler, however, I cut my teeth on the HP-65 and HP-67 programmable calculators, and even did a project for JPL tracking the Voyager craft with telemetry calculations using the HP-97 (sister to the HP-67). That program was highly, highly optimized because JPL insisted that it be one 224-step program only so they never had to reload it during the day's work. Long story short, the HP programmable calculator code was very similar to assembler, and if I were to program in assembler today, I'd just figure to spend a few hundred hours piecing together a bunch of subroutines, then call those from a relatively small main program. Piece o' cake as they say. The HP calculators beginning circa 1986 with the HP-28c used something called RPL (RPN Lisp) which in the eventual HP-48 series became probably the most extensive computer language I've seen short of C++, and C++ is a nightmare in my opinion.
 
I got through Calculus because the teacher was a firm believer in numerical solutions to problems. She allowed the students to write code for the programmable calculators of the day during the exam and submit the program for the answer. The TI SR-56 calculator that I used had no non-volatile memory, so she could be certain that my code was written during the exam. I got an A for the course. "Just-in-time programming".

The Microchip PIC 12 and PIC 16 assembly language is very close to the SR-56 and TI-58 programmable calculators that I used.
 
If I remember right, the classic problems without exact solutions, like the Traveling Salesman problem, are computationally intense, but those calculators could walk through a reasonably-sized array of data points and interpolate something close to an optimum solution. For MRP, I had a similar problem - calculate the cost of ordering a warehoused item, with possibly multiple orders with randomly-staggered demands, to get the lowest cost of warehousing quantities of the item over time versus the cost of generating and receiving the orders. The solution I came up with was wordy in the code sense, but easy to explain - line up all the demand quantities and requirement dates etc. in arrays, then split the arrays by 2, 3, 4 .... and calculate the costs of each group relative to the entire arrays. When each group is analyzed (like when split into 3 groups), move the last item in the top 2 groups into the previous group and recalc, then move the first item in the lower groups up and recalc there. All you have to save is the cost of each group for final comparisons. This push-pull method tested out to give the actual optimum cost on each problem I threw at it, and it was very fast - much faster than setting level codes at the beginning of the MRP run.

I also adapted this algorithm to calculating the minimum wiring necessary for jet plane cockpits, where I had to calculate cable lengths in 3D space. Not as hard as it might sound.

But much of that algorithm research came from a very strange place - something Fortran programmers would never have encountered. The HP-85 desktop computer I had in 1980 did not permit string arrays, even though strings on that O/S (HP BASIC O/S) were allocated and processed internally much like the 'C' language did. So to index the structured data file contents on floppy disks, dynamically as data was entered, I put 16-bit index pointers into the string memory (strings up to 32kb long) and had them point to records that were represented by an index of any number of fields and characters. dBase II for example would create huge index files - a big problem for floppy systems. The downside of my method was insertion time - because the index data wasn't written to the disk, but calculated by binary search comparisons at insertion time (load data record pointed to by current index pointer and load the indexed fields into a string), inserts could be slow depending on the cache. Numeric integer arrays could have replaced the strings, but the HP-85 O/S did not have short integers, so the packed string was it. And those packed strings led to other ideas.... In any case I sold hundreds of those file managers to floppy users, and even a few early hard-disk users, who either couldn't run dBase II or couldn't deal with the diskette limits due to index growth.
 
I write game software for gambling machines. Everything is tested vigorously in-house, then it gets sent to a governmental homologation company to confirm that the machine is compliant with the gambling laws (and isn't a fire hazard or anything).
 
There seems to be a correlation between computer professionals and Leica use. I think it's a "control thing", I like having tight control over the computer that I use and the camera that I use.

Dale - your algorithm reminded me of a story that Grace Hopper told me when I had her autograph my IBM Mark I manual. She used the "calculator" for one of her Navy courses, to minimize the time to refuel several ships off of one tanker. She basically daisy-chained them, based on the fuel capacity and the speed of the pumps on each. Everyone else in the class refueled the ships one at a time.
 

Yep - the correlation is apt, but not just as control - it's a tech thing, which might seem odd for a somewhat retro camera design. If I remember correctly, Leica M's are big in Japan, and Japanese consumers are very techie on average.

I am absolutely envious of you having met Grace Hopper. The thing you related didn't remind me so much of computers and calculations as it did those long hours standing on a road, manually fueling 100 or so of our Army trucks and jeeps from fuel pods we schlepped over from the nearest fuel depot. I suppose I could have enjoyed the fumes, but I don't remember - fumes do that to you.
 
