Re: Packages in QL
Posted: Sun Jan 05, 2025 3:59 am
I'm writing quite a long contribution to this discussion, relevant because of my efforts during the Covid-19 lockdowns to produce a text browser for QL systems called Q-Browse, and a spin-off project called QLapp (the name was a pun on QL APPlications and Clapping), a kind of downloaded-applications manager inspired by the apps-handling user experience on Android systems. Unfortunately, the Forum timed out before I finished the original message and, in logging me out, lost the entire post, as I hadn't saved a draft. That'll teach me. Which is how I come to be here at 3am retyping this, as I couldn't sleep after it happened.

tofro wrote: Sat Jan 04, 2025 10:49 am
To be honest, in my opinion a package manager is a bit of overkill for the QL (WARNING: A bit of a rant incoming).
First of all, a package manager needs some sort of persistent database of packages, files and locations. Not very viable on, for example, floppy-based systems.
Second: package managers aren't simple pieces of software. To achieve even a subset of, for example, RPM on Linux, you'd need a substantial piece of code. And if something goes wrong, problems become much harder to fix for the average user than with other, more traditional and more transparent methods. Handling dependencies (like "needs PE version xxx and up") would probably have you create a software monster in QL dimensions...
Third: The variation of systems to target is just too big: There's people with "hard disk"-based systems that will only ever run a single BOOT that includes everything but the kitchen sink (and they maintain it as some sort of sacred piece of software they'd never want someone else to mess with...); on the other end, people using floppy- or mdv-based systems that will run a multitude of BOOT files, all different.
And fourth: The QL (and, even more, SMSQ/E) includes everything you could possibly need to build single-file installs. Compilers allow you to include all necessary files within one binary, and, if your program really needs some sort of specific install directory, the HOME system allows you to neatly do that.
This goes a bit along Per's recent rant on programming: If people would put a bit of thought into their programs, they could achieve nearly everything with a single file install, or, if more sophisticated, a single directory install plus HOME_DIR. Unzip, done, run. And if you can't expect them to do even that, why would you expect them to create the necessary information for a package manager?
To sum it up: The only "package manager" that I reasonably need and want is acp_obj
I broadly agree with what's been said in this thread about Package Managers and the like, especially what I've quoted above from tofro.
I was going to describe my projects in some detail because, although I wasn't successful in the end and abandoned or postponed both projects (partly because the end of lockdown made me too busy again, and partly because I never fully got to grips with https), my experience is relevant in explaining my views and backing up what tofro says. So I'll go into some detail to explain why. I politely ask you to read it all in order to understand where I'm coming from.
The migration of most sites in recent years to https, for obvious reasons, made my original work largely redundant: my projects only ever worked with non-https sites, making them pointless now. If someone could show me how to reliably conduct the client-server dialogue needed to download https pages, with working examples of the exchange of commands, I might pick the work up again.
Even if we get the https dialogue between client and server working, the next hurdle I faced was content encoding. Over http(s), a client can tell a server which compression formats it can handle (the Accept-Encoding header), and the server tells the client which one it has chosen (Content-Encoding), so they come to an "agreement" on what compression to use. Or the server decides "if you can't handle this format, tough, it's the only one I'm prepared to use; abort the page/file request with a suitable message to the user". (Strictly speaking, the encryption itself is handled separately by TLS; gzip and its relatives are compression, not encryption.) Of the common content encodings used these days, we only have GZIP on the QL, although I live in hope that some clever QL person knowledgeable in C or other relevant languages will one day port working versions of the other common compression/decompression utilities used on the web.
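For anyone picturing the mechanics, the negotiation is just a pair of headers. Here is a minimal Python sketch (Python standing in for whatever a QL port would use, with the host name made up for illustration) of the Accept-Encoding request header and the gzip decompression the client must then perform on the body:

```python
import gzip

# A client advertises the compressed formats it accepts in its request:
request_headers = (
    "GET /index.html HTTP/1.1\r\n"
    "Host: example.com\r\n"          # hypothetical host
    "Accept-Encoding: gzip\r\n"
    "Connection: close\r\n"
    "\r\n"
)

# If the server replies with "Content-Encoding: gzip", the body must be
# decompressed before the HTML can be used. Simulated here with a locally
# compressed body rather than a real network response:
original_body = b"<html><body>Hello QL</body></html>"
compressed_body = gzip.compress(original_body)    # what the server would send
decompressed = gzip.decompress(compressed_body)   # what the client must do
assert decompressed == original_body
```

The same round trip applies to any other Content-Encoding the two sides agree on; gzip just happens to be the one the QL already has a decompressor for.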
Going back to the command dialogue before a download even starts: I tried, with software like Wireshark and browsers' own debugging tools, to watch the exchange of commands made by Windows browsers and to replicate them (in conjunction with the HTTPS documents out there) in an attempt to download over https. Although on the surface my efforts looked exactly the same as the Windows browsers', from what was visible to me in tappers and debuggers, and conformed to the published documentation I had at the time, they just didn't work, and many sites actually banned me after a few unsuccessful attempts. I'm pretty sure there were parts of the dialogue I was unable to see while watching a Windows browser talk to a server, so I was missing vital steps I couldn't replicate. It never got further than the attempted exchange of commands prior to starting any download (non-https downloads worked fine; they were fairly trivial). Clearly, something was happening which I couldn't see, and the lack of working examples in the HTTPS documentation didn't exactly make it easy: it all assumes you are using pre-written standard libraries, and there are no examples of how to do your own very basic handshaking for fairly simple HTML downloads.
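For illustration, here is roughly what that invisible part looks like when a standard TLS library is available - a minimal Python sketch, where Python's ssl module stands in for the library the QL lacks and example.com is a placeholder host. The key point is that the TLS handshake happens first, on connect, and the familiar plain-text GET is only ever sent inside the encrypted tunnel - which is why a packet capture never shows it:

```python
import socket
import ssl

def build_request(host: str, path: str = "/") -> bytes:
    # The HTTP/1.1 commands themselves are ordinary plain text.
    return (
        f"GET {path} HTTP/1.1\r\n"
        f"Host: {host}\r\n"
        f"Accept-Encoding: gzip\r\n"
        f"Connection: close\r\n"
        f"\r\n"
    ).encode("ascii")

def fetch(host: str, path: str = "/") -> bytes:
    # wrap_socket performs the TLS handshake (certificates, cipher
    # agreement); only after that do the plain-text commands flow,
    # encrypted on the wire.
    context = ssl.create_default_context()
    with socket.create_connection((host, 443)) as raw:
        with context.wrap_socket(raw, server_hostname=host) as tls:
            tls.sendall(build_request(host, path))
            chunks = []
            while True:
                data = tls.recv(4096)
                if not data:
                    break
                chunks.append(data)
    return b"".join(chunks)

# Example use (requires network):
#   reply = fetch("example.com")
#   status_line = reply.split(b"\r\n", 1)[0]
```

Replicating the captured plain text without performing the TLS layer underneath it is exactly the step that cannot work, however identical the commands look.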
Anyway, bypassing too much detail of that part of the story: in parallel with my browser, which worked at a basic level with non-https pages, I started on QLapp as a fairly basic download/package manager. It was inspired by the user experience of apps on Android - you choose or search for what you want, QLapp downloads and installs it for you without you needing much knowledge of what is happening (in other words, a comfortable experience), all within the QDOSMSQ environment on suitably equipped systems. Alongside each package it would download a parallel list giving the QLapp manager the settings to use: requirements (TK2, pointer environment, expanded RAM, specific display modes such as QL 512x256 only), whether the program understood PROG_USE/DATA_USE/Home Directory, where the program expected to be installed by default, which extensions it expected, and how messy the boot programs were - complications such as a BOOT, or the program itself, loading a picture direct to 512x256 screens, or attempting assorted POKEs which might only work on one type of system (scenarios much as tofro described). The idea was to do as much of the work for you as possible when packages were suitable for it. Much of the "settings" part of this was already in Launchpad, QDock etc. once the downloading was done; the main difference was that instead of you telling the program what to do, if it knew how, it did so with no or minimal input from you. I found it easier to download a text file paired with each zip file rather than keep a single huge database for all known QL software, simply because the majority of older software was so horrendous in terms of the settings needed (imagine the clumsiest, most unwieldy, large boot program you can, installing tons of extensions, POKEing everywhere, loading screens direct to QL video RAM, doing things which only worked on one ROM version, and many more "impossible" scenarios).
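As a sketch of the per-package text file idea, a descriptor like this can be parsed in a few lines. The field names and the package below are invented for illustration - they are not QLapp's actual format - and Python stands in for the compiled BASIC original:

```python
# Hypothetical descriptor a QLapp-style manager might download
# alongside each zip (all names invented for illustration):
descriptor = """\
name: Q-Trans
needs_ptr_env: yes
needs_tk2: yes
min_ram_kb: 256
install_dir: win1_qtrans_
home_dir_aware: no
"""

def parse_descriptor(text):
    # Simple "key: value" lines; blank lines and #-comments ignored.
    fields = {}
    for line in text.splitlines():
        line = line.strip()
        if not line or line.startswith("#"):
            continue
        key, _, value = line.partition(":")
        fields[key.strip()] = value.strip()
    return fields

pkg = parse_descriptor(descriptor)
```

One small file per package keeps each entry maintainable by hand, which matters when most of the catalogue is old software with wildly different installation quirks.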
It's all well and good for Per to expect programs to be well written, but that isn't happening with most software out there, of any age. If anything, older programs make up most of what any applications manager would be expected to handle.
After a while I realised the QLapp project was becoming more and more the proverbial "monster", and that compiled BASIC (the only language I was sufficiently proficient and confident in to embark on such a project) was not the best way of doing it anyway. It quickly became a much bigger project than I expected, and once lockdowns ended I was too busy to keep working on it, on top of the trouble I was having mastering HTTPS. With hindsight, I should have sought help from people who already used HTTPS at this level before getting to that stage. Maybe, with help on the HTTPS specifics, there might come a point when time allows and I could pick it up once more, I don't know. In case anyone feels they can help: what I need more than anything is a couple of known-working routines which take me through the exchange of commands with the server, right up to the point at which the actual page or file download starts. I was getting stuck somewhere between first contact with the server and the download starting. The HTTP commands and responses themselves are plain text, though over https they travel inside the encrypted TLS tunnel, which may well be why replicating what I saw in captures never worked. With a few clear examples I could probably work out where I was going wrong. Absolutely none of the documentation or examples online show you how to do it at this level with plain exchanges of commands; what few examples exist all seem to assume you are using ready-written standard libraries, which of course we don't have and can't use. One or two good, clear, known-to-be-working examples of an https command exchange prior to download might break the impasse.
I remember getting so frustrated - comparing what I was doing from QPC2 to what a Windows browser was doing, relating that to the HTTPS documentation, and realising that although it looked like I was doing everything right, I clearly wasn't, and I found it impossible to get past that point.
Q-Browse already works to a good degree with non-https pages on my site, enough to show the text-browser project was viable. At a basic text-browsing level, with hyperlinks and things like viewing image links using external viewers (and other programs which handled certain jobs better than trying to include such code within Q-Browse), it was all going well until https became a problem. The browser worked quite simply: it reduced the received HTML to plain text, ignoring tags it didn't understand and things like Javascript, and while doing so compiled a list of links to other pages and images, including where each occurred in the text. When you clicked, it looked up the click position in the links list to see whether a link happened to exist there, and if necessary invoked an external viewer - Photon for JPG images, unzip, and so on - with the external programs configurable by the extension of the downloaded file.
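The reduce-to-text-and-record-links approach can be sketched in a few lines. This Python fragment (a stand-in for the compiled BASIC original, with a made-up page) records the offset of each link in the plain text, so a click position can later be looked up against the list:

```python
from html.parser import HTMLParser

class TextAndLinks(HTMLParser):
    """Reduce HTML to plain text, recording where each link
    (<a href=...> or <img src=...>) falls in the output text;
    unknown tags, scripts and styles are simply ignored."""

    def __init__(self):
        super().__init__()
        self.text_parts = []
        self.links = []        # (offset_in_text, url)
        self._offset = 0
        self._skip = 0         # depth inside <script>/<style>

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag in ("script", "style"):
            self._skip += 1
        elif tag == "a" and "href" in attrs:
            self.links.append((self._offset, attrs["href"]))
        elif tag == "img" and "src" in attrs:
            self.links.append((self._offset, attrs["src"]))

    def handle_endtag(self, tag):
        if tag in ("script", "style") and self._skip:
            self._skip -= 1

    def handle_data(self, data):
        if not self._skip:
            self.text_parts.append(data)
            self._offset += len(data)

page = '<p>See <a href="page2.html">page two</a> or <img src="pic.jpg"></p>'
parser = TextAndLinks()
parser.feed(page)
plain_text = "".join(parser.text_parts)
```

A click handler then only needs to map the clicked character position back into `links` and hand the URL to the right external program by file extension.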
Things like accessing search engines were also fairly straightforward: you typed the search word or phrase into a box, and the browser embedded the query in a link to an external search engine (many allow you to do this, or did back then anyway). So the browser part would have been so much more viable had I succeeded with HTTPS. After all, its main purpose was just to access my site and download the zips within the QDOSMSQ environment, given some form of TCP/IP access from QDOSMSQ on the system - nothing much more advanced than that. A lot of work remained, but it got far enough with non-https transfers to prove its viability. At the time it seemed so viable it spurred me on to the QLapp idea. I thought of QLapp as a kind of QL app store, but in reality it was little more than a cut-down Q-Browse which knew how to download packages and what to do with them afterwards - or, if it didn't, said so and left the user to handle installation.
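The search-engine trick is just query-string encoding. A minimal sketch, with a made-up engine URL and parameter name (real engines each have their own):

```python
from urllib.parse import urlencode

def search_url(base: str, query: str) -> str:
    # Embed the typed phrase as a query-string parameter; the base URL
    # and the parameter name ("q" here) vary per engine and are
    # assumptions for illustration.
    return base + "?" + urlencode({"q": query})

url = search_url("https://www.example-search.com/search", "QL emulator")
# url is now a normal hyperlink the browser can follow like any other
```

The browser then treats the resulting URL as an ordinary link, so no special search code is needed beyond the input box.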
The QLapp part might still be viable but needs a lot more work, because of the point at which I gave up on Q-Browse (for what I thought would be a temporary period after lockdowns ended, until normality returned). It did, however, progress far enough to teach me that such an applications manager - at least, as I saw it panning out at the time - is a nightmare project: not necessarily intrinsically, but because of the QL software itself. Like QL filing systems, every program seemed to have its own inconsistent and often frustrating way of doing things, and most didn't understand simple concepts like the Home Directory (which lets software know which directory it is running from, a concept alien to our systems until people like Wolfgang tackled it).
Apart from a few modern PE programs written to use later, more standardised conventions such as the Home Directory, every program seemed to have its own way of doing things. Installation varied enormously from program to program; usually the only simple case was the single-file program - just an executable, no other files to worry about. The more I looked at it, the bigger the problem became. I realised a large 'database' of QL software wasn't the way to go: most people only download the occasional program, so it was easier to ship each package with something like a text file describing its requirements - the drive/directory it expected to run from, whether to invoke MenuConfig if it needed configuring via a config block, whether it needed the pointer environment, expanded RAM, Toolkit 2, certain extensions, and so on. If QLapp didn't know what to do with a program, it basically did what tofro suggested: downloaded the zip file, asked where you wanted it unzipped to, called unzip, then said "you're on your own now, this is too non-standard for me to handle". Just a browser and Unzip. When it came to that, you were probably better off using Q-Browse to download pages, zip files and images, with just enough code in Q-Browse to call an external program to unzip whatever you downloaded, or to invoke Photon or GPTK or similar to display images, and leave it at that. The easy alternative was to have QLapp handle only a small, select subset of the software on my site. And believe me, it would have been a small subset: with so many of the packages on my site there was no hope a single program could cope with everything. Absolutely not a chance.
I hope you are starting to see that QLapp as an applications manager was quickly becoming the proverbial monster project. It persuaded me that the way to go (at the time, anyway) was ready-to-run collections like Urs's QL/E and Black Phoenix, rather than applications managers, which quickly become impossibly large and unmanageable projects when looked at in detail, despite sounding at first like relatively obvious and straightforward ideas.
I probably sound totally negative about this. Strangely enough, I'm open to being convinced I'm wrong, and would take another look at it if (a) my problems with https could be overcome, and (b) we could think of a better way to handle the installation databases that would be needed.
Maybe if I got past the https obstacle I might revisit both projects now I'm retired, even though I feel busier than ever: people keep realising I've retired, so in theory (to them) I have more time on my hands to help them by going on committees, helping run their organisations, helping with local projects, sharing my experience, and advising on IT problems (hah, if only they realised who they were asking with that last one!), etc. - and that's before the seemingly never-ending family matters I constantly have to deal with, e.g. my disabled brother's sad situation. On the whole I enjoy my voluntary commitments to so many local projects and organisations; it's not that I'm unwilling to say no, but that so many of them ask ever more of you the more you do, while being less willing to do things themselves. Probably the old adage: "if you want something done, ask a busy man."
I'm sorry for the length and detail of this answer, and thank you for reading it. I hope it shows, if you've made it this far, that I've been through this in enough detail to know how hard and in many ways impractical a project such as a QL applications manager might be (and why I support what tofro said) - although of course I'm very open to being convinced otherwise.
(As an extra aside: if anyone is able and willing to help me with https - I have no idea where I was going wrong - I might revisit the basic browser project in particular when I get time. It needs less work than QLapp, and it's something I'm always being asked about. I know we have Lynx, but it has its complexities and shortcomings for us.)