LLStore just got blocked from Mediafire - must manually download them :(

Glenn

Administrator
Staff member
ARGHHHHH - Mediafire just made my direct downloads go to a download page, breaking LLStore, which relies on them!!!

Mediafire now requires users to generate tokens for a valid download, so I am stuffed.

I'm heading away for a few days, and I can't believe they made the feature I needed unusable past a certain usage limit... well, I can't do anything about it. So just assume you'll have to visit Mediafire, download them manually, put them on a USB or your main HDD, and install from there:

Apps:

Games:

Sorry, but I am out.
 
I've disabled the Mediafire repositories; they can't work as they are, so I have nowhere to host apps and games over 100 MB now :(
 
Is 1TB the Pro limit?

I am uploading to archive.org, and they allow direct links. I even had AI help me upload items if they don't already exist, etc., so we will see how that goes.
 
The first batch of uploads said successful, but they aren't working. I am trying again and it's 85% done. With any luck I'll have direct links to all the large apps, so I can re-enable the Mediafire repository and WebLinks.ini will redirect them to the new archive.org download links instead. We'll see how it goes; I'll do what I can to get it running in the morning, as I am away for a few days and would like to have it operating normally again. I won't be able to disable the repository if it stops working once I leave, so keep that in mind if I post here about it stopping working in the next few days; otherwise you'll be good to go ;)
 

I am building it here. I did a test with Adobe Photoshop and it worked, so I am doing a full repository test that I'll execute tomorrow morning.

The plan: I'll upload the database/metadata and screenshots to GitHub, as these are tiny and easily fit within GitHub's size limits. LLStore downloads the DB on loading, then as you click items it downloads the screenshots and icons. Finally, if you choose to install something, it takes the file name (say Rufus_v11_x64_ppApp.apz), looks up the direct link in WebLinks.ini, and downloads it; when that completes, it extracts and installs like a local item. Downloads go to Home/zLastOSRepository/, which is also scanned by LLStore, so previously downloaded items get listed too (and you can copy them out for your collection or a USB). On Windows it goes to C:\Users\UserName\Documents\zLastOSRepository, or on a LastOS with the Docs drive set to D:, it's D:\Documents\zLastOSRepository.
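The lookup step can be sketched in shell. This is only an illustration: I'm assuming WebLinks.ini is a plain "FileName=DirectURL" file and the entry shown is hypothetical — the real format LLStore uses may differ.

```shell
#!/usr/bin/env bash
# Sketch only: assumes WebLinks.ini holds simple "FileName=DirectURL" lines.

WEBLINKS="WebLinks.ini"
REPO_DIR="$HOME/zLastOSRepository"

# A tiny example WebLinks.ini for demonstration (hypothetical entry).
cat > "$WEBLINKS" <<'EOF'
Rufus_v11_x64_ppApp.apz=https://archive.org/download/LastOSLinux_Repo_2026-01/Rufus_v11_x64_ppApp.apz
EOF

# Given a requested file name, return its direct download URL.
resolve_link() {
  grep -m1 "^$1=" "$WEBLINKS" | cut -d'=' -f2-
}

name="Rufus_v11_x64_ppApp.apz"
url=$(resolve_link "$name")
echo "Would download: $url"

# The real install step would then fetch into the scanned folder, e.g.:
# mkdir -p "$REPO_DIR" && curl -L -o "$REPO_DIR/$name" "$url"
```

The nice part of this indirection is that the app names never change; only the URLs on the right-hand side need updating when the host moves.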

So as you can see, it's a simple approach, and Git is much faster than archive.org, but I am limited to 100 MB per file there. I used Google Drive first, but that required special tools to convert public links to direct links, so I bought Mediafire Pro and then found out it's limited to 1 TB of bandwidth per month. So I'll try archive.org and see if it's suitable, using real-world tests.

13/17 GB uploaded, so the upload bandwidth is much lower than Mediafire's, but if it works at all, it'll have to do.
 
Okay, so when you're done, send me the links and I'll put copies on my HFS ;)
 

Well, it's the same apps I had most recently uploaded to Mediafire; they aren't new. But I tested it and updated WebLinks.ini, using AI to download the page from Archive.org and generate what I copied and pasted into WebLinks.ini - no messing about nowadays :)

I re-enabled the Mediafire repo (not the games, as they won't be uploaded until next week).

Code:
#!/usr/bin/env bash
# Fetch the archive.org item listing and build an alphabetised
# "decoded name|direct URL" list. Requires GNU awk (gawk) for the
# three-argument match() used below.

BASE_URL="https://archive.org/download/LastOSLinux_Repo_2026-01"
HTML_FILE="listing.html"
OUTPUT="file_list.txt"

echo "Downloading listing..."
# -L follows redirects; archive.org download URLs often redirect to a node.
curl -sL "$BASE_URL" > "$HTML_FILE"

# URL-decode filenames: rewrite %XX as \xXX and let printf '%b' expand it.
urldecode() {
  local data="$1"
  printf '%b' "${data//%/\\x}"
}

echo "Parsing, decoding filenames, and sorting..."
gawk -v base="$BASE_URL" '
  {
    # Pull every href ending in .apz, .tar, or .pgz out of the line.
    while (match($0, /href="([^"]+\.(apz|tar|pgz))"/, m)) {
      encoded = m[1];
      url = base "/" encoded;
      print encoded "|" url;
      $0 = substr($0, RSTART + RLENGTH);
    }
  }
' "$HTML_FILE" | while IFS="|" read -r encoded_fname url; do
    decoded_fname=$(urldecode "$encoded_fname")
    echo "${decoded_fname}|${url}"
done | sort > "$OUTPUT"

echo "Done! Alphabetically sorted output saved to: $OUTPUT"

Here is the AI's code; I just provided the inputs and it worked first go, then I asked it to sort the list file alphabetically.
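A quick note on the urldecode helper the script uses: it rewrites every `%` as a `\x` escape and lets bash's `printf '%b'` expand the escapes. A minimal check (same function as in the script above):

```shell
#!/usr/bin/env bash
# Same helper as in the listing script: %XX -> \xXX, expanded by printf '%b'.
urldecode() {
  local data="$1"
  printf '%b' "${data//%/\\x}"
}

urldecode "Rufus%20v11%20x64.apz"   # prints "Rufus v11 x64.apz"
echo
```

It's a handy trick for file listings like this, but it's a shortcut rather than a full decoder: a filename containing a literal `%` not followed by two hex digits would be mangled.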

The other thing I did was say thank you because it knew the context as to why I needed the script. It then told me I ask questions really well... So I guess I am training the AI and the AI is training me :P
 
This has been rectified for now: I use archive.org instead. It's a little slower to download/install, but it's better than no access at all.

I also updated LLStore to show download percentages again, and it now cleanly exits when you close or skip the MiniInstaller items; before, it would attempt to install as though the download had completed successfully. So slow progress is being made. Time for me to go to work for a few hours, then I'll get back to it.
 
The first one was back when I attempted to include the databases alongside the files; now I use WebLinks.ini to "forward" a download request to wherever I want :)

No, that's not all of them; it's missing the ones you can find on Mediafire. The ones over 2 GB in size are still not uploaded.

 