What was the reason for removing single file print functions? #100
RolandHughes asked this question in Q&A (unanswered)
Was it because you wanted to support user credentials?
I've been doing C++ so long I don't remember whether C has a good method for optional parameters other than the varargs nastiness printf() uses. Just about everyone has to roll their own single-chunk-of-data print method; I had to for the BdSpoolerDevice class in Ls-Cs.
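For readers who haven't had to do it, the roll-your-own approach in plain C usually ends up looking something like this minimal sketch. The name `format_chunk` and its signature are mine, not from CUPS or Ls-Cs; it just shows the stdarg machinery printf() uses under the hood, producing one contiguous chunk a spooler class could hand off:

```c
#include <stdarg.h>
#include <stdio.h>

/* Hypothetical single-chunk print helper (illustrative name, not a
 * real CUPS or Ls-Cs API): format a message with the same varargs
 * machinery printf() uses, into one buffer suitable for a spooler. */
int format_chunk(char *out, size_t cap, const char *fmt, ...)
{
    va_list ap;
    va_start(ap, fmt);
    int n = vsnprintf(out, cap, fmt, ap); /* bounded, unlike sprintf */
    va_end(ap);
    return n; /* bytes needed, per vsnprintf semantics */
}
```

The annoyance the post alludes to is exactly this: every project re-invents this wrapper because C gives you no safer built-in way to pass an optional argument list through.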
I do generally commend (after cussing heavily) many of the 3.x design changes; I commented on them in this Ls-Cs issue. They seem to lay substantial groundwork for interfacing with 3D house printers and 3D manufacturing printers.
I have not had time to delve into the "server" mentions in the documentation. Are they just about print server communications? Part of what I read made it sound like Cups could be used as a generic file transfer layer between a desktop and a NAS or NextCloud/Netware/insert-file-server-software-name-here. That is something Cups needs to move towards if it hasn't gotten there already.
Since I'm old and been at IT for 40 years now, please allow some barefoot-in-the-snow chatter here.
From the late '70s up to about 1990, many distributed midrange and mainframe systems supported non-clustered printing. (Some of these systems are still in use today because it is not a bad design.) It was how we survived proprietary network technologies, and it proved more reliable than "the cloud."
Your order processing would be on an IBM mainframe (assuming you had volume and money) at corporate headquarters in some big-ish city. Your warehouses or PDCs (Parts Distribution Centers) would be scattered regionally. Each warehouse would have a midrange computer, a DEC VAX or whatever cheaper-than-IBM brand. The order processing would drop picking tickets in text-file form into a specific directory on the WMS machine. Other items needing to be printed there would be dropped into different directories. (We didn't have TCP/IP, and we had interesting ways of getting EBCDIC "text" files to the WMS machine as ASCII.)
A series of batch jobs would wake up every so often. Each job would look in one directory for files, validate they weren't something nasty, then queue them on the printer with the form that was associated with said directory. Once printing was successful, the file would be nuked.
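The look/validate/queue/nuke loop above can be sketched in a few lines of POSIX C. This is a stand-in, not the real WMS code: the directory layout, the `queue_to_printer` stub, and the function names are all mine, and real validation would be far pickier than "skip dotfiles":

```c
#include <dirent.h>
#include <stdio.h>
#include <unistd.h>

/* Stand-in for "queue on the printer with the associated form";
 * a real batch job would hand the file to the spooler here. */
static int queue_to_printer(const char *path)
{
    fprintf(stderr, "queued %s\n", path);
    return 0; /* 0 = printed OK, nonzero = leave file for retry */
}

/* One sweep of one drop directory: look for files, queue each one,
 * and nuke the file only after a successful print. Returns the
 * number of files queued, or -1 if the directory can't be opened. */
static int sweep_directory(const char *dir)
{
    DIR *d = opendir(dir);
    if (!d)
        return -1;

    struct dirent *e;
    char path[4096];
    int queued = 0;

    while ((e = readdir(d)) != NULL) {
        if (e->d_name[0] == '.')   /* skip ".", "..", hidden files */
            continue;
        snprintf(path, sizeof path, "%s/%s", dir, e->d_name);
        if (queue_to_printer(path) == 0) {
            unlink(path);          /* print succeeded: nuke the file */
            queued++;
        }
    }
    closedir(d);
    return queued;
}
```

A cron-style scheduler waking this up "every so often" per directory is the whole architecture; failures simply leave the file in place for the next sweep.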
Print queues didn't care who got there first. If your file needed a form that wasn't currently on the printer you sat there and jobs that came in behind you needing the form currently on the printer got to print. None of this printer screaming about a person needing to change the paper because the job at the top of the queue needed something different. If you were different you got set aside and waited until you weren't different.
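That form-aware scheduling policy is easy to state in code. A hedged sketch, with illustrative names and a plain array standing in for the queue: jobs whose form matches what is mounted print in arrival order, and everyone else is set aside without blocking the printer:

```c
#include <string.h>

/* Illustrative job record: which file to print and which paper
 * form it requires. Not taken from any real spooler's structs. */
struct job {
    const char *file;
    const char *form;
};

/* Return the index of the first queued job whose required form
 * matches the form currently mounted on the printer, or -1 if
 * nothing can print until an operator changes the form. Jobs
 * needing a different form are skipped, never blocking. */
static int next_printable(const struct job *q, int n,
                          const char *mounted_form)
{
    for (int i = 0; i < n; i++)
        if (strcmp(q[i].form, mounted_form) == 0)
            return i;
    return -1;
}
```

The contrast with a strict FIFO queue is the single `strcmp`: head-of-line jobs that need a form change wait, instead of making everyone behind them wait.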
I'm starting to see this architectural design come back in a world where North Korea and Russia hack with wanton abandon. Now it typically has a sacrificial stand-alone computer connected to the Internet with every virus scanner known to man installed on it. Once it has inspected the file eight ways to Sunday, it uses its only form of communication with the "real" computer, usually MQSeries or some other message queue that lives on a different network and uses a different network card (Token-Ring is making a comeback here, as are some other old proprietary protocols). The real computer gets the message and queues the print job.
This isn't an air-gapped design; it is a breathing-straw design: just a limited number of message queues, each of which will accept only one type of message.
Sorry if you consider this message a complete waste of time. Just sharing perspective from my side of the world.
I did find the deletion of the single-file print functions most annoying, though. I'm sure you had your reasons.