I Do Declare!


  • #1
    I Do Declare!

    Joe wrote: "... So what I've really done by turning to a declarative language is to stop writing code and instead use someone else's programming. I have thrown up the white flag and stated in no uncertain terms that someone else is better qualified to write my code than I am. ..."

    First, this reminds me of the debate over high-level languages versus assembler. At one time, it may have been true that writing in assembler yielded the most efficient programs. But even if that was ever true, for the past few decades optimizing compilers have clearly been the better choice.

    Second, like it or not, almost all of us programmers must depend on "someone else's programming". Whether it's operating system code or compilers or source editors or communications protocols or library functions, there's always code written by someone else. That's a simple fact of life.

    Joe also wrote: "But Bill doesn't know how my database is laid out. Not really. Bill is like a really good contractor that you just hired. He knows the files and formats because he can look at the tables, but he doesn't know the contents of the data. He doesn't, for example, know that there are no records with a status of COMP and an active flag of Y, because that's a system design point that isn't encoded in the database. And while this knowledge may not affect the query he's working on, it's the kind of information that lets a programmer who knows the system write better code than one who doesn't."

    Okay, my expertise is compilers and not DBMSs, but the same principles are at play here. Optimizing compilers have been around for decades, but believe it or not, advances are still being made in the craft of optimizing. (I don't think I want to call it a "science"!) For example, one current technique is to profile a program while it's running and make optimizations on the fly based on the code paths taken.

    The same has been done in modern DBMSs for years now (as you probably already know, or should know): a DBMS can track the behavior of the system in order to optimize its performance. The domain-specific knowledge of the DB analyst, while useful, is no longer absolutely necessary.

    You do make some valid points. The programmer or DB analyst still needs to do his job correctly. But he can also take advantage of the work of others and count on the underlying software to take his designs to their full potential.

    Cheers! Hans
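    Hans's point about statistics-driven optimization is visible in any modern SQL engine. As a minimal sketch (SQLite here, purely as a stand-in for the DBMSs under discussion; the table and index names are invented), `ANALYZE` gathers table and index statistics that the query planner then consults when choosing an access path:

```python
import sqlite3

# Throwaway in-memory database, purely for illustration.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, status TEXT)")
conn.execute("CREATE INDEX idx_status ON orders (status)")
conn.executemany(
    "INSERT INTO orders (status) VALUES (?)",
    [("OPEN",)] * 990 + [("COMP",)] * 10,
)

# ANALYZE records row counts and index selectivity in sqlite_stat1;
# the planner reads them back when picking a scan vs. an index lookup.
conn.execute("ANALYZE")
stats = conn.execute("SELECT tbl, idx, stat FROM sqlite_stat1").fetchall()
print(stats)

# With statistics in hand, a selective predicate goes through the index.
plan = conn.execute(
    "EXPLAIN QUERY PLAN SELECT * FROM orders WHERE status = 'COMP'"
).fetchall()
print(plan[0][3])  # the plan detail mentions idx_status
```

    The same feedback loop, on a much larger scale, is what lets a DBMS "track the behavior of the system" as Hans describes.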

  • #2
    I Do Declare!

    Hans, I made sure to differentiate between the various levels of code dependence, so please don't lump them all together again and then make broad generalizations. OS code is different from compiler code, which is different from database code, which in turn is different from application code. And just as a great application coder is not the best person to write an optimizing compiler, a compiler writer has no business writing application code.

    My point is that certain types of tools, such as query optimizers and especially application code generators, cross the line in such a way that the tool writer is writing application code, something they are simply not equipped to do. And while statistical analysis of databases may eventually allow query optimizers to generate the best queries in the majority of situations, there is no way that application generators will ever be able to write the best code for any business application, because you need to know the problem domain before you write the code. And by definition, someone writing a code generator can't know the problem domain.

    So yes, we can take advantage of and build on the work of others; we always do. Technology advances, skills coalesce, and some things become commoditized. Application programmers need to take advantage of the latest tools in order to get their primary job done: codification of business rules to make businesses run more efficiently.

    I stopped writing my own operating systems in the early 80s, and I stopped using assembly language in the late 80s. I use a lot of SQL these days, and I use JSPs all the time. Both take advantage of someone else's code. But I only use them for what they're intended. And as long as such tools are used as an adjunct to business programming rather than as a replacement for it, they're great tools.

    Joe



    • #3
      I Do Declare!

      Joe, You wrote your own operating systems? Can you elaborate on that? That's pretty interesting. What'd they run on? What did you write them in? Why did you stop?



      • #4
        I Do Declare!

        I got my start in programming in the early 80s in a small office that sounds a lot like the one you started in. The owner went out and promised whatever it would take, and we had to make it happen. At that time, I was doing it mostly in interpreted BASIC, and as unbelievable as it might seem, I still have clients using that uncompiled software. If you have enough will and determination, you can make any language do impressive things.

        I'm certainly appreciative of tools like SQL, but I also don't like giving up control over the processing to someone I've never met. I feel the same way about purchasing large open-system software packages. It may be amazing how adaptable they are and how they can accommodate the needs of so many different businesses, but I don't think they are ever as good as a mid-size package developed for the specific industry with a little flexibility provided.



        • #5
          I Do Declare!

          Halfway through my first semester in computer programming eons ago, I remember struggling simply because I could not understand how a computer could take a statement like a=a+1 and generate the magical bits to add 1 to some register assigned to variable 'a' deep inside a silicon wafer (or was it a vacuum tube? I forget). After I exasperated one of my classmates with droning "how", "why", and "when" questions for several weeks, he finally looked me square in the eye and declared, "YOU DON'T CARE how or why or when! It JUST DOES!" I laugh at it now, but at the time, that was a leap of faith I was just not ready to make. (It wasn't until I completed Assembler I & II on an IBM 1130 that I was ready to give up my ideas of "control" and trust Fortran, Cobol, PL/I, etc. to do the right thing.)

          'Tis a bit of an interesting take coming from someone who worked in a major capacity for one of the major ERP software vendors. Your customers buy your black boxes (aka ERP software) and trust what you (er, your salespeople) say they will do. Based on my experience with various software packages, I would choose in-house developed software any day. Is that control-freakish?

          BTW, everyday life is FULL of black boxes. If I had to care how a microwave heats food, or how cars work, or how electricity flows through wires, well, crap, I should become a Luddite. Fortunately, I don't care to the point that I would have to know how things work down to the smallest detail before I'd use them. Curiosity drives me to learn some details, and that is good for its own sake.
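          The "magical bits" behind a=a+1 are easier to peek at today than they were on the 1130. As a small modern analogue (Python's `dis` module standing in for a compiler listing; nothing from the original discussion), you can watch the statement become a load/add/store sequence:

```python
import dis

# Compile the statement and list the instructions the compiler emits.
code = compile("a = a + 1", "<example>", "exec")
names = [ins.opname for ins in dis.get_instructions(code)]
print(names)

# Depending on the Python version, the add appears as BINARY_ADD
# (3.10 and earlier) or BINARY_OP (3.11+); either way it is the
# classic load -> add -> store shape the poster was puzzling over.
assert "LOAD_NAME" in names and "STORE_NAME" in names
```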



          • #6
            I Do Declare!

            Your car is a black box, SQL is a black box, a 4GL is a black box. Business management is a black box. It is about focusing on END RESULTS. You do not have to plant organic vegetables in your backyard to eat them; get them from WAL-MART. The same is true of programming: you can operate a business with ORACLE, SAP, JDE, MAPICS, COPICS, BPCS, or any black-box-based package instead of developing it in-house, with similar results.



            • #7
              I Do Declare!

              The problem with SQL is that young programmers are only being taught this one method of accessing the DB. I will qualify that by saying PC programmers. We don't have an i5 programming course in the schools around here. I have been programming for a good long time, since 1977, and have written tons of systems code and application code, a 4GL, thousands of commands, and C and VB on the PC too.

              From where I sit, SQL is a good tool for batch work like reports and update processes. But, on the i5 at least, it still goes through too many steps to be a replacement for what I do manually with direct access to the DB. On the PC using a local DB it is not really noticeable until the record quantities start to rise. On a small iSeries at a customer site I have an interactive search program that performs a partial license plate search over a few million vehicle entries and does it in about .02 seconds (I added a Google-like search response time to it just to show the users the AS/400's absolute dominance). By partial plate I mean a search for all vehicles with a license plate that contains XYZ somewhere in the plate. SQL can't even begin to match this. It hasn't even figured out what index to use in the amount of time my program takes to publish the results.

              My son came to work with me after getting his PC programming degree, and then I started to teach him what this programming world is really about. We were working on a PC program that needed to go get a specific record for a key value sent to it. After he coded it I went through it. He used SQL, as he was taught, to select the record that matched the value. Oh the horror. I told him to use a seek (for non-PC'ers, it's like a chain) instead, but he didn't know what that was. Honestly, on a local MS Access DB the SQL was fine for a single record fetch, but of course it couldn't keep up when I started streaming a few thousand of these requests every second. There is simply a lot of overhead in using SQL to gain that level of simplification.

              The point is, the school didn't teach him any other way to get things done. SQL was the only way to access the DB. Young programmers have no idea there is another way, much less how to use it. That is becoming a lost skill. There are huge performance differences in using SQL in certain situations. The computers keep getting faster and tool code like SQL keeps getting better, but this last mile of attempting to achieve what an experienced programmer can do is a long stretch. I openly admit there are some things tools can do better than I. For many of us experienced application programmers on the i5, SQL just isn't there yet.

              That is not to say I disagree with you, Hans. All of your points are dead on. I am sure SQL will continue to evolve, and this industry, like any other, will see expensive laborers replaced by less expensive solutions such as automation. In this case I refer to less experienced programmers using the automation brought on by SQL (but in some cases outsourcing and external packages). Business demands it. Hell, we demand it. Who wants to keep writing all this code; can't I just talk to my computer yet???

              For SQL to achieve the same results I can get manually, its future level of sophistication will require the ability to remember its previous attempts, so it can learn and evolve its index selection and walking to match my knowledge of the application. Although it could shortcut that process by letting me tell it what I know, such as which index to use and how to move through it. That, however, defeats the goal of eliminating the NEED for my knowledge while accomplishing the business task.

              Ok, rant petering out now. Shane.
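              Shane's partial-plate example is worth unpacking: a contains-match defeats an ordinary B-tree index, while an equality lookup uses it directly. A minimal sketch (SQLite as a generic stand-in; the i5 access paths differ, and the table and names are made up) shows the planner's choice in each case:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE vehicles (id INTEGER PRIMARY KEY, plate TEXT)")
conn.execute("CREATE INDEX idx_plate ON vehicles (plate)")
conn.executemany(
    "INSERT INTO vehicles (plate) VALUES (?)",
    [("ABC123",), ("XYZ789",), ("1XYZ42",)],
)

def plan(sql):
    # EXPLAIN QUERY PLAN's detail column says SEARCH (keyed index
    # lookup) or SCAN (touch every row).
    return conn.execute("EXPLAIN QUERY PLAN " + sql).fetchall()[0][3]

eq_plan = plan("SELECT id FROM vehicles WHERE plate = 'XYZ789'")
like_plan = plan("SELECT id FROM vehicles WHERE plate LIKE '%XYZ%'")
print(eq_plan)    # SEARCH: the index drives straight to the row
print(like_plan)  # SCAN: B-tree order cannot help a contains-match
```

              A sub-second contains-search over millions of rows, like the one described, generally implies a purpose-built structure (for instance, an index over substrings) that a generic optimizer will not invent on its own.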



              • #8
                I Do Declare!

                A little bit of a disconnect here. Businesses start with one of those packages, or the many other RPG source code packages, and inevitably want to modify it to support the business process. Although self-interested software vendors like Ellison of Oracle talk down customization, I've yet to see a modification that wasn't serious to the business. And they're the only people whose opinion matters. What matters least is the self-interest of software people. rd



                • #9
                  I Do Declare!

                  I wrote Intel-based multi-tasking, interrupt-driven communications machines. They sat between an IBM midrange (such as a System/38) and various other computer equipment. If you are familiar with the history of the midrange, IBM not only had its own internal character representation (EBCDIC) but also its own communications protocol (bisync). While faster than traditional asynchronous communication because it didn't use start and stop bits, bisync was foreign to most other devices except, well, other IBM machines. So, in order to talk to a modem, or an async terminal, or a printer, or a cash register, or just about anything else in the world, you needed some way to transform the communications, and our little product did that.

                  It allowed you to control up to 16 different asynchronous devices through a single bisynchronous connection line. You could configure each line with all sorts of parameters such as speed, number of bits, duplex and so on (including burst-mode transmissions, in which you could transfer a whopping 32K of data at one time!). We had a boot loader which took up less than 8K and which sat on the line and downloaded the operating system from the midrange. The entire OS was under 32K, because we needed room for the buffers, and 64K was as much as you could access without a lot of work. Our development box was a machine with an 8-inch Winchester hard disk drive -- that held a whopping 5MB of storage! It was quite a successful venture, and the company that marketed the product continues to this day.

                  However, I then got introduced to the world of application programming through the development of a real-time commodity trading system, and I was hooked. Seeing a computer actually make someone's job easier is a really great feeling. I fell in love with the midrange during that time, and now I'm doing everything I can to preserve it, despite attempts to turn it into an SQL back end or a multi-partition Java processor.

                  Interestingly enough, it's this background that has allowed me to have faith in things over the years -- but only as they deserved it. The first microprocessors (and especially the very finicky USART chips) often had timing mistakes that you could only find with an oscilloscope. In the early days, these could be crucial, and yes, I have debugged computer errors using an O-scope (one in particular had to do with an interrupt on a TTD in a bisynchronous communication). Eventually, though, the chips got good enough that I didn't need to do that.

                  Next were assemblers. Assemblers, with the exception of macro assemblers, were typically very good, because they did very little. They were primarily concerned with translating assembly language into machine code, pretty much one-for-one, and they did that well. Macro assemblers often got confused, though, when the guy who designed the macro expansion didn't take into account something we programmers would do.

                  Compilers were the next level. Early C compilers were rife with bugs. Man, you wouldn't believe the incorrect MOV instructions generated back in those days. Often you had to disassemble the object code in order to find the error -- which you couldn't do if you didn't understand how the machine code worked in the first place. Then try to take a runtime written for a dedicated machine and make it work in an interrupt-driven environment. I'll tell you some stories about Pascal MT-Plus someday that'll curl your hair. But eventually compilers got better, and now I trust them almost 100% of the time.

                  I then spent a lot of time writing database code, translating between the EBCDIC-based Series/1 and ASCII-based Ontel word processors. In so doing, I learned how indices work and why a SETLL is faster than a CHAIN. Because back then databases got corrupted, and you often had to try to pull the data out with your teeth.

                  My career continues along those lines. I know how Java works internally. I understand how a JSP generates Java source, which is in turn compiled to bytecode, which is later converted to machine code by the JIT compiler. The only part of that process which is a black box to me is the JIT compiler, because that level of optimization is pretty esoteric and highly mathematical in nature. At some point, a man's got to know his limitations, and I leave it to the compiler folks to get that part right.

                  But I guess my VERY long-winded point is that I may trust something, but only because I understand how it works. In a pinch, I can usually go in and debug just about any code ever written. Even generated code can be sniffed through. However, things like query optimizers start to take that away and make us rely on the code of others, and as far as I've seen from the state of computer programming today, the median level of programming skill is not going up. As programming drifts from imperative to declarative, people are losing sight of how the machine works, and eventually we'll be a bunch of push-button end users cobbling together bits of code written by people who never even saw a real end user.

                  Joe
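                  The SETLL-vs-CHAIN aside has a simple mechanical reading: SETLL only positions within the keyed access path, while CHAIN positions and then fetches the record. A rough sketch (Python's `bisect` standing in for the index; the file, keys, and helper names are mine, not RPG):

```python
import bisect

# A keyed "file": a sorted key list (the index) plus a record store.
keys = ["A100", "B200", "C300"]
records = {"A100": {"qty": 5}, "B200": {"qty": 7}, "C300": {"qty": 2}}

def setll(key):
    # SETLL analogue: binary-search the index to position at the key.
    # No record data is read, which is why an existence check via
    # SETLL is cheaper than a CHAIN.
    pos = bisect.bisect_left(keys, key)
    return pos < len(keys) and keys[pos] == key  # like testing %Equal

def chain(key):
    # CHAIN analogue: position AND fetch, paying the extra data read.
    return records[key] if setll(key) else None

print(setll("B200"))  # True -- found, record never touched
print(chain("B200"))  # {'qty': 7}
print(chain("Z999"))  # None
```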



                  • #10
                    I Do Declare!

                    Everyday life may be full of black boxes, but then again I'm not an electrician or a mechanic; I am a programmer. I don't charge people money to fix their microwaves; I charge money to write software. Of course, if an electrician doesn't know how electricity flows through wires, he is going to remove himself from the labor pool quickly enough. Unfortunately, there isn't quite as strong a natural selection process in the programming profession. Joe



                    • #11
                      I Do Declare!

                      Semi-custom code is the end-all and be-all of programming, in my book. You start with a bunch of code that's already written and then extend it to fit your own business requirements. And that's what those wonderful, complex ERP systems written in RPG were all about, especially my beloved BPCS. That's way different from a shrink-wrap system with a bunch of switches (SAP is more like this model, and of course most commercial PC programs are this way because you don't get the source). But even if you do get the source for a PC package, you have to have someone who understands the business, and someone with a degree in CS and a Microsoft certification isn't necessarily going to have the business knowledge to be able to modify the system.

                      Now it may be that we're in the waning days of semi-custom software. We might look at code written now the same way we look at the silversmiths of the 18th century: master craftsmen whose work eventually fell prey to the economies of mass production. However, I think that there are certain inherent characteristics of a really good programmer/analyst that make them irreplaceable in a true custom software environment.

                      The folks who just want to "get the job done" without having a clue how it's being done under the covers really don't give a crap about programming. They see programming as a paycheck, and not as a craft. They want their job to be ever easier and faster, and they'll fall in with the latest programming fad (Extreme Programming, anyone?). And those are the kinds of jobs that are easy to outsource. However, the guy who knows the database, understands the machine, takes the time to learn the user's needs, and then comes up with a brand new way of doing things, THAT'S the guy who is going to give his company an edge. And in the end, that's what the computer is really about. In my opinion, anyway. Joe



                      • #12
                        I Do Declare!

                        I agree 100%, Ralph. When you've actually designed working applications for end users from their requirements, you know that while 80% of the requirements are common and the code can be cookie-cutter, it's that 20% of custom code that can make or break a company. It's all about the details, and anybody who thinks that shrink-wrap software can compete with a really good business analyst probably has never met a really good business analyst. Joe



                        • #13
                          I Do Declare!

                          Joe, I was intrigued by all the talk a few years ago that software was a commodity, that IT didn't matter, and by vendors telling their customers that they really needed to change to the way the "best practices" shrink-wrap software worked. That's assuming the commodity software even made an attempt to do it. I did some searching and found that the inviable was viable after all. Oracle, PeopleSoft, whatever -- there were discussions of the issues customization raised for upgrades. Competitive advantage for their customers was of no concern to these software vendors; all they cared about was eliminating impediments to selling new versions of their software. They cared about their own sales first, not their customers' sales.

                          That talk has died down now. The business world never bought it. Sure, they'll try to keep costs down as always, but they'll do what it takes to make that deal and deliver on it. And they do what it takes with custom software, because even they don't know what it will take until they've made that deal. IT does matter, and customized iSeries RPG software packages have been one of the most powerful ways for a business to make it matter. You should know; you had a customer list of some of the largest corporations in the world whose only black box was the AS/400. rd



                          • #14
                            I Do Declare!

                            A good business analyst is results-oriented, not task-oriented; that's Systems Analysis 101. The 20% customization figure is correct, but rule number 1 is "NEVER ALTER THE BASE CODE ITSELF IF POSSIBLE", to avoid costly future software upgrades. This is where you hire consultants/contractors to do the programming around the package -- the TAXIGRAMMER or EXPERTSIVE people. In the AS400/i5 world you copy and edit the base code and put it on top of the library list, document it well, and bingo, off you go. This practice works for big, medium, and small business. THE MORAL LESSON is you hire experts to do the right job and focus on your core business competency to survive this ever-faster-changing world called the global economy. In reality this is Business Process Outsourcing... personally I don't like it, and IT is very much vulnerable when the good BUSINESS ANALYST DECIDES SO...
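                            The library-list trick described above (edit a copy of the base code and place its library higher in the list, so it shadows the untouched vendor code) resolves much like a PATH search. A toy sketch of the resolution order, with made-up library and program names:

```python
# Library list, highest priority first (names are invented).
library_list = ["MODLIB", "BASELIB"]

# Programs each library contains.
libraries = {
    "MODLIB": {"ORDENTRY": "customized copy"},
    "BASELIB": {"ORDENTRY": "vendor base version",
                "INVPRINT": "vendor base version"},
}

def resolve(program):
    # The first library containing the program wins, so the customized
    # copy in MODLIB shadows the vendor code in BASELIB -- and the
    # base code itself is never altered.
    for lib in library_list:
        if program in libraries[lib]:
            return lib, libraries[lib][program]
    raise LookupError(program)

print(resolve("ORDENTRY"))  # ('MODLIB', 'customized copy')
print(resolve("INVPRINT"))  # ('BASELIB', 'vendor base version')
```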



                            • #15
                              I Do Declare!

                              Shane said: "The problem with SQL, is the young programmers are only being taught this one method of accessing the DB. I will qualify that by saying PC programmers. We don't have an i5 programming course in the schools around here."

                              Conversely, the GREAT thing about SQL happens in the hands of those of us who were brought up with native I/O. SQL is a great tool for the database, but IMO, it is even better at teaching one how to design databases to be more effective for the business. We can see another world -- one where we say, "Hmm, I can see native I/O working well in that area, and yes, SQL will work well for that one." The young guys are driving blind in that area: they can fly over a forest quite well, but ask them to give you a tour on the ground in a 4-wheeler, and they get bogged down in the first puddle.

                              Lately I've gravitated towards Joe's benchmarks of native I/O vs. SQL when designing database access logic, and I think they work well. Joe has made the point, quite strongly, that those who become bigots about either method (native vs. SQL) are doing themselves a disservice. (Joe, that's my general take on what you have said; I don't mean to put words in your mouth.) I liken it to carpenters and the nail gun. The stubborn old-timers use hammers only, and the stubborn new guys use nail guns; BUT the best ones are the guys who know that the nail gun works in a lot of cases, yet there are times when nothing will suffice but the old trusty hammer.

                              [Off topic: When my home was built, using trusses, I sat and watched a carpenter construct a pyramid frame over the entry area FROM NOTHING (the trusses left the area formless). Using nothing but 2x4's, a circular saw, measuring tape, a pencil, his eyes, nails, and a HAMMER, he sat up there and created a mass of 2x4's going in every direction. I have looked up there many times since, searching for some hint of it being out of square, but I cannot find one. He did a perfect job.] Anyway, it's a poor analogy, but I like the story.

                              Joe said: "...the guy who knows the database, understands the machine, takes the time to learn the user's needs, and then comes up with a brand new way of doing things, THAT'S the guy who is going to give his company an edge. And in the end, that's what the computer is really about."

                              What a statement. IMO, that is the great tragedy of outsourcing: not only the job losses, but also the light it shines on the abject failure of US companies to understand the true value of people. I must say, I've only worked with a handful of people having all of these attributes (NO, WAIT. I've only known 2 of them), but it is wonderful to be around people like that. There are those who are true GURUS in the programming world, and there are those who are truly gifted at communicating with users and understanding their needs. But, IMHO, it is rare to find someone with talent in both areas. Further, for the technicians who work under such an individual, those attributes actually rub off, making everyone better at their jobs. (Everything rolls downhill, you know.) If every company would strive to find JUST ONE of these individuals and do what it took to keep them as a trusted, happy, and motivated partner, there would never be a need to outsource a single IT job.
