From Newsgroup: comp.lang.cobol
In the light of occasional emails asking me how it's going, the
following is a copy of a post to the News page of our web site.
It is also being copied to LinkedIn:
=====================================
Thanks for your enquiry and interest.
As I mentioned in the correspondence, we were expecting to launch
Release 2 back in October 2016.
However, as we ran various Proofs of Concept (POCs), things came to
light which needed much more consideration. (That's what POCs are for...)
PRIMA's policy is NOT to release software with known bugs or
deficiencies and we try to get it right before we go public. All our
software carries a lifetime guarantee and we will fix it for free if
bugs are found in any existing processes we release.
There were things that people wanted which we should reasonably provide,
and some of them are pretty complex to implement (like the List View you
saw in the Spanish Postal video).[
http://primacomputing.co.nz/videos/SpainPOC.MP4]
Support for tabbed screens, Menus, Embedded SQL in PowerCOBOL, details
of Controls and Attributes needing support in the GUI Support Class, all
kept us pretty busy. And, obviously, paid work had to take priority.
By February of this year the screen generation was 100% complete, and
all of the complex wiring which connects your screen with events in the
code-behind is now finished and has successfully passed our Test Plans.
So we then turned our attention to the code-behind generation.
The generation of "Raw" code-behind is pretty much OK, but the problems
lie in Transforming the PowerCOBOL scriptlet references to PowerCOBOL
Controls into .NET COBOL references to Windows Forms controls. The
control has to be recognised (along with its attributes) and then code
to invoke the GUI Support Class must be generated. The GUI Support Class
also has to be extended as support for new controls is added.
It is perfectly possible (for advanced users with good COBOL and C#
skills) to write your own support into the COBOL code-behind and to
extend the GUI Support Class yourself (we will provide the C# source
code for it free, to any client who asks for it, and we will also
provide help and guidance on extending it) so that any third party
controls or extended PowerCOBOL controls (like the ADO control, for
example...) can be supported. However, I would like Release 2 to support
as much as possible "out of the box".
As we ran hundreds of thousands of scriptlet lines through the Migration
process we needed to keep adjusting the Transformation process, and the
result is that it has become over-maintained. (It also slowed
significantly as more Controls were processed; every word of every line
of COBOL needed a table lookup to see whether it was a control... not
good :-).) With all the experience thus gained, we have designed a
completely new approach to Transformation that recognises COBOL syntax
across source lines and knows which words are candidates to be controls.
Transformation is still a non-trivial process, but our early results for
the new code are very encouraging.
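For illustration only, here is a toy sketch (in Python, with invented
names; the real Toolset works on nothing like this scale) of the
difference between probing a control table for every word and probing
only syntax-determined candidate positions:

```python
# Hypothetical sketch of the two lookup strategies described above.
# CONTROL_TABLE and the "TO"/"INVOKE" heuristic are invented for this
# example; they are not PRIMA's actual rules.

CONTROL_TABLE = {"CMBCUSTOMER", "LSTORDERS", "BTNOK"}  # known controls

def naive_pass(lines):
    """Old approach: look up EVERY word of every line in the table."""
    hits = []
    for line in lines:
        for word in line.replace(".", " ").split():
            if word.upper() in CONTROL_TABLE:   # one probe per word
                hits.append(word.upper())
    return hits

def syntax_aware_pass(lines):
    """New approach: only words in positions where a control can appear
    (here, following TO or INVOKE) are probed against the table."""
    hits = []
    for line in lines:
        words = line.replace(".", " ").upper().split()
        for i, word in enumerate(words):
            if i > 0 and words[i - 1] in ("TO", "INVOKE"):
                if word in CONTROL_TABLE:
                    hits.append(word)
    return hits

source = [
    "MOVE CUSTOMER-NAME TO CMBCUSTOMER.",
    'INVOKE LSTORDERS "Refresh".',
    "ADD 1 TO WS-COUNT.",
]
print(naive_pass(source))        # probes every word of every line
print(syntax_aware_pass(source)) # probes only candidate positions
```

Both passes find the same controls, but the syntax-aware pass does far
fewer table probes, which is the point of the redesign.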
So, the current position is that we are continuing to develop the new Transformation process, while also doing the paid work which keeps the
company running. I cannot give an exact date for when this process will
be release-ready. (It depends on other work and priorities.) However,
the project WILL be completed this year and I am hopeful it will be
before July/August.
Obviously, the more "early adopters" we get, the higher the priority
becomes... :-)
When we have something ready for release, it will be announced on the
web site.
Meanwhile, I am copying you on a response to a client who is considering
data migration from ISAM to RDB and is using PowerCOBOL. This may give
you some food for thought:
"One of my clients is under pressure to migrate COBOL data files to RDB
as soon as possible but the problem is that they insist on doing it
before PCOB2NET migration. This is not the best option I know :-("
[Pete says...]
It is perfectly fine to do that PROVIDED the code is NOT PowerCOBOL.
(I'll explain the problem with PowerCOBOL in a moment…) Even if it IS
PowerCOBOL, the toolset will create and load the new database OK, and
you can still generate the DAL layer with either ESQL or LINQ. The
problems arise when you come to access the new database.
In order to do this, the existing code (which has standard COBOL verbs
for accessing the files) has to be Transformed so that all references to
legacy flat files are turned into SQL references to the new database.
The Migration Toolset Transformation process addresses this problem.
Transformation does NOT generate Embedded SQL into the legacy code.
(That is what most people do when migrating without tools; they locate
every COBOL READ (for example) and change it to a SELECT on the table
which corresponds to the old flat file record.) It is a disaster… you
now have hard-coded SQL scattered through the applications, and you have
a database that is far from optimized in any way; it isn't normalized
and it has redundant fields on it (from groups and REDEFINES).
PRIMA has a MUCH smarter approach than that, and it is FULLY automated.
We DON'T generate a single table for each legacy record. We create a SET
of tables in 3rd Normal Form, and we MANAGE that table set with a
generated OO Class that happens to be COM compliant, so that COBOL can
use it. The collection of these Classes, which manage the entire
database, is referred to as the "Data Access Layer" (DAL). The DAL layer
can be generated in COBOL using Embedded SQL (the default), or it can be
generated in C# using LINQ (this is many times more powerful and is
ideal for clients who need really high performance from their DB. COBOL
programs do not normally have access to LINQ, so this is a major
advantage and brings the benefits of modern DB technology to legacy
COBOL code.) The source language of the DAL really doesn't matter
because it is never maintained; it is generated and compiled
automatically. In either case (ESQL or LINQ) the COBOL legacy just
invokes the same methods and interface from the DAL object, using
standard COM interaction.
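To make the separation concrete, here is a minimal sketch of the
pattern, with Python standing in for the generated code and every name
invented for illustration; the real DAL is generated COBOL or C# behind
a COM interface, not this:

```python
# Sketch of the DAL separation pattern described above: the application
# always passes the same flat COBOL record buffer; the storage backend
# is swappable without touching application code. All class and field
# names here are hypothetical.

class CustomerDAL:
    """Stands in for a generated DAL object: one fixed interface
    per legacy record type."""
    def __init__(self, backend):
        self.backend = backend   # ESQL- or LINQ-style engine; the
                                 # application never sees it directly
    def write(self, record_buffer):
        self.backend.store(self.decompose(record_buffer))
    def read(self, key):
        return self.recompose(self.backend.fetch(key))
    @staticmethod
    def decompose(buf):
        # fixed-position record: CUST-ID PIC X(5), CUST-NAME PIC X(10)
        return {"id": buf[0:5], "name": buf[5:15].rstrip()}
    @staticmethod
    def recompose(fields):
        return fields["id"] + fields["name"].ljust(10)

class InMemoryBackend:
    """Stand-in for either generated backend; only the DAL talks to it."""
    def __init__(self):
        self.rows = {}
    def store(self, fields):
        self.rows[fields["id"]] = fields
    def fetch(self, key):
        return self.rows[key]

dal = CustomerDAL(InMemoryBackend())
dal.write("00042Smith     ")   # application passes the COBOL buffer
print(repr(dal.read("00042"))) # same buffer layout comes back
```

Swapping `InMemoryBackend` for a different engine changes nothing on the
application side, which is why the ESQL-to-LINQ switch needs no
recompile of the calling programs.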
(The first PRIMA client who agreed to try using LINQ was amazed when he
found he didn't even have to RE-COMPILE a single program! We generated
the LINQ DAL objects and replaced his existing DAL library with the new
one, re-registering the COM components for each tableset. It all worked
perfectly, and all he saw was a performance improvement: 5x faster in
some cases…)
The DAL layer has all the SQL (or LINQ), but instead of an SQL call
hard-coded into the application for every access, the DAL layer is smart
and uses pre-compiled generic SQL calls which were tailored at
generation time to support specific table sets on the DB. This approach
provides a separation layer between the Business logic in the
applications and the data maintenance requirements.
The DAL objects provide the application code with exactly the same
buffer it would have received if it was accessing a flat file, but there
is no hard coding required to do it. Applications pass the same COBOL
record buffer to the DAL and the DAL objects decompose it and write it
to the correct places on the database. (If you wanted to run a COBOL
batch process against the database, it works exactly the same way: the
Batch COBOL invokes the DAL instead of using its own hard-coded SQL, and
it receives the record formats it is expecting.) The DAL can be invoked
from desktop code, in any language that supports the Component Object
Model (COM), which is all of the major languages, including COBOL, or it
can be invoked from Javascript, VBScript, or JSON on a web page.
Separation means the data access can be tuned and optimized
independently of the applications with no impact on application code
when changes are made. (A new DAL object is generated to support the
affected tableset if definitions are changed; there is no impact on the
applications that use it.) Applications that don't use new fields will
continue to process as they always have; applications which DO need the
new fields will process the new fields through the DAL layer.
So, summarizing, the process of Migration Transformation creates the connection between the legacy COBOL and the new DAL layer, so that the
legacy code is able to access the new DB exactly as it did the old flat
files. However, the DB is NOT just the old record layout as a table;
rather, it is in 3rd Normal Form with repeating groups (OCCURS) broken
out, key dependencies checked, and optimized so that COBOL group fields
and REDEFINES fields are NOT stored on the DB. (The DAL objects
automatically create these fields when they reconstruct the COBOL record
layout in memory, but they are actually accessing fewer fields than if a
COBOL record layout had been stored, and they may be doing it using
Language Integrated Query (LINQ), which has optimizations in it that are
simply not available to ESQL.)
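As a rough illustration of the decomposition just described, here is a
toy round-trip (in Python, with a tiny invented layout; the generated
DAL objects do this for real table sets, not like this):

```python
# Hypothetical legacy layout: ORD-ID PIC X(4), then ORD-LINE
# OCCURS 3 TIMES, PIC X(6) each. The OCCURS group is broken out into
# child rows (3NF-style); the group field itself is never stored, and
# the original buffer is rebuilt in memory on read.

def decompose(buf):
    """Split one flat record into a parent row plus child rows."""
    parent = {"ord_id": buf[0:4]}
    children = [
        {"ord_id": buf[0:4], "seq": i, "line": buf[4 + i * 6 : 10 + i * 6]}
        for i in range(3)
    ]
    return parent, children   # only these rows would hit the database

def reconstruct(parent, children):
    """Rebuild the COBOL record layout the application expects."""
    body = "".join(c["line"] for c in sorted(children, key=lambda c: c["seq"]))
    return parent["ord_id"] + body

p, cs = decompose("A001ITEM01ITEM02ITEM03")
print(reconstruct(p, cs))   # round-trips to the original buffer
```

The application only ever sees the reconstructed buffer, so it is
unaware that the data actually lives in normalized tables.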
We read the legacy source, Transform it, then recompile it with
NetCOBOL. Done. Legacy runs as it always has; if it worked before
Transformation, it will work after Transformation. (The Toolset makes
NO CHANGES to the legacy program logic.)
So, why is PowerCOBOL a problem?
To do the process for PowerCOBOL we have to get access to all the
scriptlets, Transform every one to invoke the DAL layer, then replace the
code in the PowerCOBOL project so it can be re-compiled through the
PowerCOBOL IDE.
The Toolset has no problem extracting all the legacy scriptlets from the
project, and it has no problem Transforming them and creating a new
scriptlet for each one, BUT it CANNOT replace the new scriptlet into the
PowerCOBOL project! These projects are NOT updateable under program
control; they MUST be updated by hand.
We found through bitter experience that the process of cutting and
pasting the Transformed scriptlets back into their PowerCOBOL projects
is error-prone and tedious in the extreme. I enhanced the Toolset to
make it very clear in the scriptlet code where it needs to be cut and
pasted but even so, it is not something you would want to do across
dozens of PowerCOBOL projects. (Nevertheless, you CAN do it… and, at the
moment, PCOB2NET is not available anyway…)
We recommend PowerCOBOL clients go with PCOB2NET because you DON'T have
to maintain the PowerCOBOL projects if you go this route.
You generate the new form, generate the code-behind, create your new
database, generate the DAL layer, Transform the code-behind to use the
new DAL layer, re-compile it, and you're done.
I hope this explanation makes it clearer. Remember to review the videos,
and please, let me know if there are still things that are not clear.
Best Regards,
Pete.
--
I used to write COBOL; now I can do anything...