Anyone here who can convert this MF code to PowerCobol?
Apparently pointers are treated differently.
https://stackoverflow.com/questions/39957668/how-to-start-using-sqlite-from-cobol.
Thank you for any help
Jm
On 22/10/2017 12:22 AM, JM wrote:
Anyone here who can convert this MF code to PowerCobol?
Apparently pointers are treated differently.
Yes, they are not required for Fujitsu COBOL.
https://stackoverflow.com/questions/39957668/how-to-start-using-sqlite-from-cobol.
Thank you for any help
Jm
You don't need any of the pointers or C language routines if you use Fujitsu.
Fujitsu use a pre-compiler pass so that the SQL statements can be
embedded directly in your program (prefixed with "EXEC SQL" and
"END-EXEC"). That includes the CONNECT to whatever DB you want.
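For anyone who has not seen the pre-compiler style being described, here is a minimal sketch. The table (CUSTOMER), the columns, the server name 'SV1', and the user string are all invented for illustration, and the exact CONNECT syntax depends on how your Fujitsu runtime environment is configured, so treat this as a sketch rather than working code:

```cobol
       IDENTIFICATION DIVISION.
       PROGRAM-ID. SQLDEMO.
       DATA DIVISION.
       WORKING-STORAGE SECTION.
           EXEC SQL BEGIN DECLARE SECTION END-EXEC.
       01  HV-CUST-ID          PIC S9(9) COMP-5.
       01  HV-CUST-NAME        PIC X(30).
           EXEC SQL END DECLARE SECTION END-EXEC.
           EXEC SQL INCLUDE SQLCA END-EXEC.
       PROCEDURE DIVISION.
      * Connect to a server defined in the runtime environment;
      * 'SV1' and the USER string are placeholders.
           EXEC SQL CONNECT TO 'SV1' USER 'user/password' END-EXEC.
           MOVE 42 TO HV-CUST-ID.
      * The pre-compiler turns this into calls to the runtime;
      * the row comes back in the host variables declared above.
           EXEC SQL
               SELECT CUST_NAME INTO :HV-CUST-NAME
               FROM   CUSTOMER
               WHERE  CUST_ID = :HV-CUST-ID
           END-EXEC.
           EXEC SQL DISCONNECT DEFAULT END-EXEC.
           STOP RUN.
```

Note there are no pointers and no C glue anywhere: the host variables are ordinary WORKING-STORAGE items, which is the point being made above.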
Why are you targeting PowerCOBOL? Are you wanting to make an obsolete primitive GUI interaction? You don't NEED PowerCOBOL for this, though
you CAN use it if you want to.
For actions against the database you could probably use a component
written in NetCOBOL for Windows. (You can invoke it from PowerCOBOL or a
Web Page or the desktop if you write it in the manner recommended by PRIMA.)
If you check out: https://primacomputing.co.nz/PRIMAMetro/RDBandSQL.aspx
you will find information that has bearing on this exercise. There are
three pages and a portal page, also a link to sample Fujitsu NetCOBOL
for Windows code that implements a general purpose Fujitsu COBOL
database module. This could be modified to meet your requirement in a
very short time. (I'm not giving you the link; you need to read the
material and then you'll find it...)
Presumably, if you are targeting PowerCOBOL, you have a set of Fujitsu manuals? You should read the chapter on using SQL; it is pretty simple
and straightforward.
If you do the homework and decide to have a go at the approaches
described, I'll help you convert the code if you get stuck.
Pete.
--
I used to write COBOL; now I can do anything...
On Sunday, 29 October 2017 at 13:26:06 UTC, pete dashwood wrote:
[snip]
Thanks for the answer!
Well... I know the "EXEC SQL..." instructions; I already use them to
access MySQL. In this case, I wanted to have native access without using
ODBC, and very dynamic use, for data processing locally and interfacing
with third parties.
PowerCOBOL obsolete? I know, but it's what I have :(
Regards,
On 31/10/2017 4:50 AM, JM wrote:
If you go with DAL you will write ONE set of code. It does all possible
actions against a Database: Get Random, Get Sequential, Get Skip
Sequential, Insert, Update, and Delete.
When the next Database comes along you clone the code above. The logic
doesn't change; only the interface (the COBOL record layout and the Host
Variables).
In article <f5q5suFftdqU1@mid.individual.net>,
pete dashwood <dashwood@enternet.co.nz> wrote:
On 31/10/2017 4:50 AM, JM wrote:
[snip]
If you go with DAL you will write ONE set of code. It does all possible
actions against a Database: Get Random, Get Sequential, Get Skip
Sequential, Insert, Update, and Delete.
By Godfrey, it sounds like a VSAM I/O module I wrote in COBOL a few years back for a client who was so afraid of cleaning up dead data that they
were exceeding the 4Gb file-size limit.
[snip]
When the next Database comes along you clone the code above. The logic
doesn't change; only the interface (The COBOL record layout and the Host
Variables)
Mr Dashwood, if 'only' the COBOL record layout and the Host Variables
change, might a system recompile (and the resulting end-to-end testing to
satisfy Audit requirements) be something a Diligent Manager would, just
possibly, insist upon?
On 1/11/2017 12:35 AM, docdwarf@panix.com wrote:
[snip]
No, that's the whole point. The Data Access Layer (comprised of modules
like the one we are discussing) is only ever accessed through an
interface. It is SEPARATED from the actual business logic. The whole
system does NOT have to be re-compiled; ONLY the DAL object affected,
and regression testing is only required for programs that invoke THAT
DAL object.
In article <f5si2nF2japU1@mid.individual.net>,
pete dashwood <dashwood@enternet.co.nz> wrote:
[snip]
So... when the current database has DDL which generates (for a client
number, last order date, last order value and last order ship-to US postal code)
05 CLINUM PIC 9(6).
05 LSTORDDT PIC 9(6).
05 LSTORDVL PIC 9(8)V9(2).
05 LSTORDZP PIC 9(9).
... and the company starts doing business internationally so a
Corner-Office Idiot dictates the last order's currency symbol be included
and the postal code be changed for alphanumerics, resulting in:
05 CLINUM PIC 9(6).
05 LSTORDDT PIC 9(6).
05 LSTORDVL PIC 9(8)V9(2).
05 LSTORDCC PIC X(3).
05 LSTORDPC PIC X(9).
... then you're saying that all one has to do is code the Data Access
Layer with the new DDL and the simple logic to determine whether the
program accessing it is Old Code or New Code?
DD
For me, new developments will never be made in Cobol. Finished.
On 2/11/2017 1:08 AM, docdwarf@panix.com wrote:
[snip]
... then you're saying that all one has to do is code the Data Access
Layer with the new DDL and the simple logic to determine whether the program accessing it is Old Code or New Code?
DD
It would certainly be possible to add a flag in the interface saying
WHICH COBOL record definition was required (the old one or the new one)
and extend the DAL object to deal with either. But the general idea is
NOT to change the DAL object's logic. (For PRIMA, we generate these
objects anyway, so the code is normally consistent.)
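The "flag in the interface" idea could be sketched like this, reusing the field names from the example earlier in the thread. The version flag and the REDEFINES arrangement are illustrative assumptions, not PRIMA's generated code:

```cobol
      * Hypothetical interface area: a version flag tells the DAL
      * object which COBOL record definition the caller is using.
       01  DAL-INTERFACE.
           05  DAL-LAYOUT-VERSION      PIC 9.
               88  OLD-LAYOUT              VALUE 1.
               88  NEW-LAYOUT              VALUE 2.
      * One shared record area, REDEFINEd per layout version;
      * sized to the larger (new) record, 34 bytes.
           05  DAL-RECORD-AREA         PIC X(34).
           05  OLD-RECORD REDEFINES DAL-RECORD-AREA.
               10  CLINUM              PIC 9(6).
               10  LSTORDDT            PIC 9(6).
               10  LSTORDVL            PIC 9(8)V9(2).
               10  LSTORDZP            PIC 9(9).
           05  NEW-RECORD REDEFINES DAL-RECORD-AREA.
               10  CLINUM-N            PIC 9(6).
               10  LSTORDDT-N          PIC 9(6).
               10  LSTORDVL-N          PIC 9(8)V9(2).
               10  LSTORDCC-N          PIC X(3).
               10  LSTORDPC-N          PIC X(9).
```

Old callers set OLD-LAYOUT and are untouched; only the DAL object and the callers that actually want the new fields get recompiled.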
What we might expect to happen here is that the COBOL COPY Book defining
the record layout would be changed to the NEW version. This record
layout is what the applications "see" in the interface to the DAL
object. It is shared through LINKAGE between the DAL object and the application.
The Host Variables for the new definition would be DECLGENed and added
to the DAL object. (Again, we have tools that do this automatically when
the DAL object is generated, but you can certainly do it by hand and I
have done... You can see typical COBOL code that loads and unloads the
HVs in a DAL object by viewing:
https://primacomputing.co.nz/PRIMAMetro/demosandtutorials.aspx
Click on "video 7" - it is about 20 minutes. I am slowly working through
these videos to make them shorter and more pertinent to the latest
Toolset releases, but the principles involved are still the same.) The
DAL object gets re-compiled. Note that its LOGIC has not changed; only
the record(s) it is required to construct/deconstruct (the record may be
REDEFINEd, as in standard COBOL...), and the HVs required to do that.
The applications which CALL/INVOKE this DAL object/module get recompiled
so the new record layout is available, and, of course, the logic to deal with the new/changed fields is implemented into the application. This
should be pretty minimal, as only a few fields have changed. (A system
Data Dictionary can be useful for finding which applications use this
DAL, but it really depends on the size of the system. Our experience
shows that it is much quicker and easier to find where a DAL object is
used than it is to scan for every ESQL statement that may use one of the
DB columns that was changed...)
These applications (the ones affected by the change) get regression tested.
There is NO duplicated SQL in the system and the Applications deal only
with the record layout, without worrying about the mechanics of HOW it
is populated.
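From the application side, the effect is that a program only ever moves an action code and a record around. A hypothetical caller might look like this (the module name 'CLIENTDAL', the action codes, and all field names are invented for illustration):

```cobol
       IDENTIFICATION DIVISION.
       PROGRAM-ID. APP1.
       DATA DIVISION.
       WORKING-STORAGE SECTION.
      * Minimal invented interface area; in practice this would be
      * a COPY book shared with the DAL object.
       01  DAL-INTERFACE.
           05  DAL-ACTION          PIC X(2).
           05  DAL-STATUS          PIC X(2).
           05  DAL-RECORD.
               10  CLINUM          PIC 9(6).
               10  LSTORDDT        PIC 9(6).
       PROCEDURE DIVISION.
      * Ask for one row by key. The application never sees the SQL
      * that the (hypothetical) CLIENTDAL module runs to satisfy it.
           MOVE 'GR'   TO DAL-ACTION
           MOVE 123456 TO CLINUM
           CALL 'CLIENTDAL' USING DAL-INTERFACE
           IF DAL-STATUS NOT = '00'
               DISPLAY 'DAL error: ' DAL-STATUS
           END-IF
           STOP RUN.
```

If the CUSTOMER SQL ever changes, this program does not: only the DAL object and the copybook do, which is the encapsulation point being made.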
This concept of using a separated DAL layer is useful in many languages,
but for COBOL, which is innately "record oriented", it fits extremely well.
Pete.
--
I used to write COBOL; now I can do anything...
In article <f5v0erFjvgkU1@mid.individual.net>,
pete dashwood <dashwood@enternet.co.nz> wrote:
[snip]
What we might expect to happen here is that the COBOL COPY Book defining
the record layout would be changed to the NEW version.
'... record layout would be changed' == code is recompiled == code must be tested. For critical modules the preferred method is end-to-end.
DD
On 3/11/2017 5:50 AM, docdwarf@panix.com wrote:
'... record layout would be changed' == code is recompiled == code must
be tested. For critical modules the preferred method is end-to-end.
There is nothing in the approach described that precludes testing
end-to-end.
You just don't HAVE to, in order to establish that the change was effective.
People with an OO background, who understand encapsulation, have no
problem with this.