• Fujitsu Cobol & Sqlite

    From JM@1:2320/100 to comp.lang.cobol on Sat Oct 21 04:22:26 2017
    From Newsgroup: comp.lang.cobol


    Anyone here who can convert this MF code to PowerCobol?
    Apparently pointers are treated differently.

    https://stackoverflow.com/questions/39957668/how-to-start-using-sqlite-from-cobol.


    Thank you for any help
    Jm

    SEEN-BY: 154/30 2320/100 0 1 227/0
  • From pete dashwood@1:2320/100 to comp.lang.cobol on Mon Oct 30 02:25:59 2017

    On 22/10/2017 12:22 AM, JM wrote:

    Anyone here who can convert this MF code to PowerCobol?
    Apparently pointers are treated differently.

    Yes, they are not required for Fujitsu COBOL.

    https://stackoverflow.com/questions/39957668/how-to-start-using-sqlite-from-cobol.


    Thank you for any help
    Jm



    You don't need any of the pointers or C language routines if you use
    Fujitsu.

    Fujitsu uses a pre-compiler pass so that SQL statements can be
    embedded directly in your program (prefixed with "EXEC SQL" and
    terminated with "END-EXEC"). That includes the CONNECT to whatever DB you want.
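    In outline, an embedded-SQL NetCOBOL program looks something like the
    sketch below. This is illustrative only: the data source, table, column,
    and host-variable names are all invented, and the exact CONNECT and
    DISCONNECT forms depend on your Fujitsu version and data source setup,
    so check the manuals for the precise syntax.

    ```cobol
          *> Illustrative sketch only: 'MYDSN', the CUSTOMER table, and
          *> the host-variable names are invented for this example.
           IDENTIFICATION DIVISION.
           PROGRAM-ID. SQLDEMO.
           DATA DIVISION.
           WORKING-STORAGE SECTION.
               EXEC SQL BEGIN DECLARE SECTION END-EXEC.
           01  HV-CUST-ID    PIC S9(9) COMP-5.
           01  HV-CUST-NAME  PIC X(30).
               EXEC SQL END DECLARE SECTION END-EXEC.
               EXEC SQL INCLUDE SQLCA END-EXEC.
           PROCEDURE DIVISION.
          *> Connect to whatever DB the data source points at.
               EXEC SQL CONNECT TO 'MYDSN' END-EXEC.
               MOVE 1001 TO HV-CUST-ID.
               EXEC SQL
                   SELECT CUST_NAME INTO :HV-CUST-NAME
                     FROM CUSTOMER
                    WHERE CUST_ID = :HV-CUST-ID
               END-EXEC.
               IF SQLCODE NOT = ZERO
                   DISPLAY "SQL error, SQLCODE=" SQLCODE
               ELSE
                   DISPLAY "Customer: " HV-CUST-NAME
               END-IF.
               EXEC SQL DISCONNECT END-EXEC.
               STOP RUN.
    ```

    The pre-compiler turns each EXEC SQL block into runtime calls, which is
    why no pointers or C routines appear in the source.
    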

    Why are you targeting PowerCOBOL? Are you wanting to make an obsolete primitive GUI interaction? You don't NEED PowerCOBOL for this, though
    you CAN use it if you want to.

    For actions against the database you could probably use a component
    written in NetCOBOL for Windows. (You can invoke it from PowerCOBOL or a
    Web Page or the desktop if you write it in the manner recommended by PRIMA.)

    If you check out: https://primacomputing.co.nz/PRIMAMetro/RDBandSQL.aspx
    you will find information that has bearing on this exercise. There are
    three pages and a portal page, also a link to sample Fujitsu NetCOBOL
    for Windows code that implements a general purpose Fujitsu COBOL
    database module. This could be modified to meet your requirement in a
    very short time. (I'm not giving you the link; you need to read the
    material and then you'll find it...)

    Presumably, if you are targeting PowerCOBOL, you have a set of Fujitsu manuals? You should read the chapter on using SQL; it is pretty simple
    and straightforward.

    If you do the homework and decide to have a go at the approaches
    described, I'll help you convert the code if you get stuck.

    Pete.
    --
    I used to write COBOL; now I can do anything...

  • From JM@1:2320/100 to comp.lang.cobol on Mon Oct 30 08:50:54 2017
    On Sunday, 29 October 2017 at 13:26:06 UTC, pete dashwood wrote:
    [snip]
    Thanks for the answer!

    Well... I know the "EXEC SQL..." statements; I already use them to access
    MySQL. In this case, I wanted native access without ODBC, and very dynamic
    use, for processing data locally and interfacing with third parties.
    PowerCOBOL obsolete? I know, but it's what I have :(

    Regards,

  • From pete dashwood@1:2320/100 to comp.lang.cobol on Tue Oct 31 15:36:11 2017

    On 31/10/2017 4:50 AM, JM wrote:
    [snip]
    Thanks for the answer!

    Well... I know the "EXEC SQL..." statements; I already use them to access
    MySQL.
    In this case, I wanted native access without ODBC, and very dynamic use,
    for processing data locally and interfacing with third parties.

    PowerCOBOL obsolete? I know, but it's what I have :(
    Regards,

    Sure, I understand.

    PowerCOBOL is actually a very good tool, but it has passed its expiry
    date and there is no migration path being offered to get out of it.
    PRIMA are offering that path and we have just released a Toolkit to help people bring PowerCOBOL into .Net. (The Toolkit generates Windows Forms
    from the PowerCOBOL forms and salvages all the scriptlets to become .Net code-behind for the Windows Forms.) Details here: https://primacomputing.co.nz/PRIMAMetro/pwrCOBMigration.aspx

    Given the requirements you stated above "without using ODBC and very
    dynamic use for data processing locally and interface with third
    parties" what you are looking for is a standard general purpose Database Access Module. (We have a tool that can generate these in COBOL or C#). Obviously, as you are looking for it to use with a specific DB, you will
    need a module tailored for that DB. There are a couple of flies in your ointment though.

    If you use Fujitsu you MUST use ODBC for access to standard RDB systems,
    when using NetCOBOL. (There is an alternative which involves using
    their ADO component, and that is available through PowerCOBOL).

    The real decision you need to make is whether you will go with the DAL approach explained in the pages referenced in my previous mail, or
    whether you just want to learn how to embed SQL into Fujitsu COBOL.

    If you go with DAL you will write ONE set of code. It does all possible actions against a Database: Get Random, Get Sequential, Get Skip
    Sequential, Insert, Update, and Delete. The actions are written as
    methods in a Fujitsu OO COBOL Class using NetCOBOL for Windows (or, you
    can use the CIL generating compiler if you have one...). Because it is
    an OO .dll you can use the Fujitsu *COM interface with it (it is very
    much easier to write COM components with Fujitsu than it is with Micro
    Focus) and then it is available through the standard COM interface to
    any third party, Web Page, desktop, or any language that implements the
    COM interface.

    When the next Database comes along you clone the code above. The logic
    doesn't change; only the interface (the COBOL record layout and the Host
    Variables).
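    The calling shape of such a DAL object can be sketched roughly as
    follows. This is hypothetical: "CUSTOMER-DAL", the action codes, and the
    record layout are invented for illustration and are not PRIMA's actual
    generated interface.

    ```cobol
          *> Hypothetical caller-side sketch: the module name, action
          *> codes, and record layout are invented for illustration.
           WORKING-STORAGE SECTION.
           01  DAL-REQUEST.
               05  DAL-ACTION          PIC X(2).
                   88  DAL-GET-RANDOM      VALUE "GR".
                   88  DAL-GET-SEQUENTIAL  VALUE "GS".
                   88  DAL-GET-SKIP-SEQ    VALUE "GK".
                   88  DAL-INSERT          VALUE "IN".
                   88  DAL-UPDATE          VALUE "UP".
                   88  DAL-DELETE          VALUE "DL".
               05  DAL-RETURN-CODE     PIC S9(4) COMP-5.
           01  CUSTOMER-RECORD.        *> normally a shared COPY book
               05  CUST-ID             PIC 9(6).
               05  CUST-NAME           PIC X(30).

           PROCEDURE DIVISION.
               MOVE 123456 TO CUST-ID
               SET DAL-GET-RANDOM TO TRUE
          *> The application never sees SQL; it just asks for the record.
               CALL "CUSTOMER-DAL" USING DAL-REQUEST CUSTOMER-RECORD
               IF DAL-RETURN-CODE NOT = ZERO
                   DISPLAY "DAL error: " DAL-RETURN-CODE
               END-IF
               STOP RUN.
    ```

    Because the SQL lives only inside the DAL module, a column change
    touches that one module and the shared record layout, rather than SQL
    statements scattered through the system.
    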

    Like I said, we have tools that generate the code required, but if you
    want to have a go at this, I'll generate the module for you. (You will
    need to give me the current database, OR, if the database was converted
    from original ISAM files, the FD/01 and SELECT COPY books for the COBOL
    ISAM files.)

    If you just want to know how to insert ESQL into Fujitsu COBOL then read Chapter 8 of the NetCOBOL Language Reference.

    Pete.
    --
    I used to write COBOL; now I can do anything...

  • From docdwarf@1:2320/100 to comp.lang.cobol on Tue Oct 31 11:35:30 2017

    In article <f5q5suFftdqU1@mid.individual.net>,
    pete dashwood <dashwood@enternet.co.nz> wrote:
    On 31/10/2017 4:50 AM, JM wrote:

    [snip]

    If you go with DAL you will write ONE set of code. It does all possible
    actions against a Database: Get Random, Get Sequential, Get Skip
    Sequential, Insert, Update, and Delete.

    By Godfrey, it sounds like a VSAM I/O module I wrote in COBOL a few years
    back for a client who was so afraid of cleaning up dead data that they
    were exceeding the 4Gb file-size limit.

    [snip]

    When the next Database comes along you clone the code above. The logic
    doesn't change; only the interface (The COBOL record layout and the Host
    Variables)

    Mr Dashwood, if 'only' the COBOL record layout and the Host Variables
    change, would a system recompile (and the resulting end-to-end testing to satisfy Audit requirements) be something a Diligent Manager might, just possibly, insist upon?

    DD

  • From pete dashwood@1:2320/100 to comp.lang.cobol on Wed Nov 1 13:16:17 2017

    On 1/11/2017 12:35 AM, docdwarf@panix.com wrote:
    In article <f5q5suFftdqU1@mid.individual.net>,
    pete dashwood <dashwood@enternet.co.nz> wrote:
    On 31/10/2017 4:50 AM, JM wrote:

    [snip]

    If you go with DAL you will write ONE set of code. It does all possible
    actions against a Database: Get Random, Get Sequential, Get Skip
    Sequential, Insert, Update, and Delete.

    By Godfrey, it sounds like a VSAM I/O module I wrote in COBOL a few years back for a client who was so afraid of cleaning up dead data that they
    were exceeding the 4Gb file-size limit.

    Yep, I can well imagine that it did sound like that. I first cottoned on
    to the idea of separation between data and business logic when I
    developed a VSAM database back in the late 1970s for a large
    multi-national in England. Long before there were RDBMS, we were writing "access modules" using common logic, in much the way described.

    [snip]

    When the next Database comes along you clone the code above. The logic
    doesn't change; only the interface (The COBOL record layout and the Host
    Variables)

    Mr Dashwood, if 'only' the COBOL record layout and the Host Variables
    change, would a system recompile (and the resulting end-to-end testing to satisfy Audit requirements) be something a Diligent Manager might, just possibly, insist upon?

    No, that's the whole point. The Data Access Layer (comprised of modules
    like the one we are discussing) is only ever accessed through an
    interface. It is SEPARATED from the actual business logic. The whole
    system does NOT have to be re-compiled; ONLY the DAL object affected,
    and regression testing is only required for programs that invoke THAT
    DAL object.

    The concept in play here is ENCAPSULATION. I tried to cover it in the
    pages on the web site, maybe it isn't as clear as I would like it to
    be...I'll review it.

    https://primacomputing.co.nz/PRIMAMetro/RDBandSQL3.aspx

    The main difference between the DAL approach and using ESQL directly in
    the applications, is that for the ESQL solution, you have duplicated SQL statements scattered throughout the code of your application programs.
    This DOES require extensive regression testing and recompilation across
    the entire system. The DAL approach does not.

    Pete.
    --
    I used to write COBOL; now I can do anything...

  • From docdwarf@1:2320/100 to comp.lang.cobol on Wed Nov 1 12:08:56 2017

    In article <f5si2nF2japU1@mid.individual.net>,
    pete dashwood <dashwood@enternet.co.nz> wrote:
    [snip]

    No, that's the whole point. The Data Access Layer (comprised of modules
    like the one we are discussing) is only ever accessed through an
    interface. It is SEPARATED from the actual business logic. The whole
    system does NOT have to be re-compiled; ONLY the DAL object affected,
    and regression testing is only required for programs that invoke THAT
    DAL object.

    So... when the current database has DDL which generates (for a client
    number, last order date, last order value and last order ship-to US postal code)

    05 CLINUM PIC 9(6).
    05 LSTORDDT PIC 9(6).
    05 LSTORDVL PIC 9(8)V9(2).
    05 LSTORDZP PIC 9(9).

    ... and the company starts doing business internationally so a
    Corner-Office Idiot dictates the last order's currency symbol be included
    and the postal code be changed for alphanumerics, resulting in:

    05 CLINUM PIC 9(6).
    05 LSTORDDT PIC 9(6).
    05 LSTORDVL PIC 9(8)V9(2).
    05 LSTORDCC PIC X(3).
    05 LSTORDPC PIC X(9).

    ... then you're saying that all one has to do is code the Data Access
    Layer with the new DDL and the simple logic to determine whether the
    program accessing it is Old Code or New Code?

    DD

  • From pete dashwood@1:2320/100 to comp.lang.cobol on Thu Nov 2 11:34:00 2017

    On 2/11/2017 1:08 AM, docdwarf@panix.com wrote:
    [snip]

    So... when the current database has DDL which generates (for a client
    number, last order date, last order value and last order ship-to US postal code)

    05 CLINUM PIC 9(6).
    05 LSTORDDT PIC 9(6).
    05 LSTORDVL PIC 9(8)V9(2).
    05 LSTORDZP PIC 9(9).

    ... and the company starts doing business internationally so a
    Corner-Office Idiot dictates the last order's currency symbol be included
    and the postal code be changed for alphanumerics, resulting in:

    05 CLINUM PIC 9(6).
    05 LSTORDDT PIC 9(6).
    05 LSTORDVL PIC 9(8)V9(2).
    05 LSTORDCC PIC X(3).
    05 LSTORDPC PIC X(9).

    ... then you're saying that all one has to do is code the Data Access
    Layer with the new DDL and the simple logic to determine whether the
    program accessing it is Old Code or New Code?

    DD

    It would certainly be possible to add a flag in the interface saying
    WHICH COBOL record definition was required (the old one or the new one)
    and extend the DAL object to deal with either. But the general idea is
    NOT to change the DAL object's logic. (For PRIMA, we generate these
    objects anyway, so the code is normally consistent.)

    What we might expect to happen here is that the COBOL COPY Book defining
    the record layout would be changed to the NEW version. This record
    layout is what the applications "see" in the interface to the DAL
    object. It is shared through LINKAGE between the DAL object and the application.

    The Host Variables for the new definition would be DECLGENed and added
    to the DAL object. (Again, we have tools that do this automatically when
    the DAL object is generated, but you can certainly do it by hand and I
    have done...You can see typical COBOL code that loads and unloads the
    HVs in a DAL object by viewing: https://primacomputing.co.nz/PRIMAMetro/demosandtutorials.aspx Click on
    "video 7" - it is about 20 minutes. I am slowly working through these
    videos to make them shorter and more pertinent to the latest Toolset
    releases, but the principles involved are still the same.) The DAL
    object gets re-compiled. Note that its LOGIC has not changed; only the
    record(s) it is required to construct/deconstruct (the record may be
    REDEFINEd, as in standard COBOL...), and the HVs required to do that.
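    The "load the Host Variables, then execute" pattern inside a DAL method
    might be sketched like this (the record fields, HV names, and table are
    illustrative, not PRIMA's generated code):

    ```cobol
          *> Sketch of a DAL INSERT method: field, HV, and table names
          *> are invented for illustration.
           INSERT-CUSTOMER SECTION.
          *> Load the Host Variables from the record shared in LINKAGE...
               MOVE CUST-ID   OF CUSTOMER-RECORD TO HV-CUST-ID
               MOVE CUST-NAME OF CUSTOMER-RECORD TO HV-CUST-NAME
          *> ...then execute the (pre-compiled) embedded SQL.
               EXEC SQL
                   INSERT INTO CUSTOMER (CUST_ID, CUST_NAME)
                   VALUES (:HV-CUST-ID, :HV-CUST-NAME)
               END-EXEC
               MOVE SQLCODE TO DAL-RETURN-CODE.
    ```

    When a column changes, only these MOVEs, the HV declarations, and the
    statement itself need regenerating; the method's control logic stays
    the same.
    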

    The applications which CALL/INVOKE this DAL object/module get recompiled
    so the new record layout is available, and, of course, the logic to deal
    with the new/changed fields is implemented into the application. This
    should be pretty minimal, as only a few fields have changed. (A system
    Data Dictionary can be useful for finding which applications use this
    DAL, but it really depends on the size of the system. Our experience
    shows that it is much quicker and easier to find where a DAL object is
    used than it is to scan for every ESQL statement that may use one of the
    DB columns that was changed...)

    These applications (the ones affected by the change) get regression tested.

    There is NO duplicated SQL in the system and the Applications deal only
    with the record layout, without worrying about the mechanics of HOW it
    is populated.

    This concept of using a separated DAL layer is useful in many languages,
    but for COBOL, which is innately "record oriented", it fits extremely well.

    Pete.

    --
    I used to write COBOL; now I can do anything...

  • From JM@1:2320/100 to comp.lang.cobol on Thu Nov 2 03:10:52 2017
    On Wednesday, 1 November 2017 at 22:34:05 UTC, pete dashwood wrote:
    [snip]
    For me, new developments will never be made in COBOL. Finished.

    I started with DG ICOBOL, then Micro Focus; as the runtime costs were
    high, the next compiler was Fujitsu COBOL. Because Fujitsu killed
    PowerCOBOL without providing a way to convert applications to more modern
    platforms, we concluded that conversion costs, whether manual or with
    third-party tools, would always be very high because of the size of the
    application. New developments are being made in JavaScript. For now we
    are very satisfied with the progress.

  • From docdwarf@1:2320/100 to comp.lang.cobol on Thu Nov 2 16:50:28 2017

    In article <f5v0erFjvgkU1@mid.individual.net>,
    pete dashwood <dashwood@enternet.co.nz> wrote:

    [snip of example, please research]

    ... then you're saying that all one has to do is code the Data Access
    Layer with the new DDL and the simple logic to determine whether the
    program accessing it is Old Code or New Code?

    [snip]

    What we might expect to happen here is that the COBOL COPY Book defining
    the record layout would be changed to the NEW version.

    '... record layout would be changed' == code is recompiled == code must be tested. For critical modules the preferred method is end-to-end.

    DD

  • From pete dashwood@1:2320/100 to comp.lang.cobol on Fri Nov 3 13:01:41 2017

    On 2/11/2017 11:10 PM, JM wrote:
    "For me, new developments will never be made in Cobol. Finished.
    I started with DG Icobol, then Microfocus, as the runtime costs were
    high, the next compiler was Fujitsu Cobol. Because Fujitsu killed
    Powercobol by not finding a way to convert applications to more modern platforms, we concluded that conversion costs manually or with
    third-party tools would always be very high because of the size of the application. New developments are being made in Javascript. For now we
    are very satisfied with the progress."

    I don't think it is fair to say Fujitsu killed PowerCOBOL... they were
    over a barrel because it was written originally by contractors who moved
    on. They COULDN'T do much about it without a huge investment which they decided could not be justified.

    There IS a way to convert/modernize PowerCOBOL applications and we do
    not charge an arm and a leg for it. BUT, it has taken us a couple of
    years to develop, so you can fairly argue that we may be too late with
    this one. Nevertheless, some people are currently evaluating it and it
    DOES allow you to salvage all your business rules (in scriptlets) and
    upgrade your existing GUI screens.

    Javascript (especially with XAML) is not a bad option, and I can
    understand people opting for it.

    It would not be hard for us to convert PowerCOBOL to XAML, in a similar
    way to converting to C# Win Forms.

    Our tools analyse the PowerCOBOL and produce an XML description that
    could then be used to build almost anything.

    Pete.



    --
    I used to write COBOL; now I can do anything...

  • From pete dashwood@1:2320/100 to comp.lang.cobol on Fri Nov 3 13:08:06 2017

    On 3/11/2017 5:50 AM, docdwarf@panix.com wrote:
    [snip]

    What we might expect to happen here is that the COBOL COPY Book defining
    the record layout would be changed to the NEW version.

    '... record layout would be changed' == code is recompiled == code must be tested. For critical modules the preferred method is end-to-end.

    DD

    There is nothing in the approach described that precludes testing
    end-to-end.

    You just don't HAVE to, in order to establish that the change was effective.

    People with an OO background, who understand encapsulation, have no
    problem with this.

    Pete.
    --
    I used to write COBOL; now I can do anything...

  • From docdwarf@1:2320/100 to comp.lang.cobol on Fri Nov 3 23:30:25 2017

    In article <f61qbaFcfebU1@mid.individual.net>,
    pete dashwood <dashwood@enternet.co.nz> wrote:
    On 3/11/2017 5:50 AM, docdwarf@panix.com wrote:

    [snip]

    '... record layout would be changed' == code is recompiled == code must be tested. For critical modules the preferred method is end-to-end.

    There is nothing in the approach described that precludes testing end-to-end.

    You just don't HAVE to, in order to establish that the change was effective.

    Mr Dashwood, I have known auditors who vehemently disagree with this statement.

    People with an OO background, who understand encapsulation, have no
    problem with this.

    Ahhhh, the old (and frequently smug) You Just Don't Understand... Mr
    Dashwood, I have known accountants with an OO background who would agree strongly with the aforementioned auditors.

    DD
