Ketarin forum

shawn

Moderators
  • Content Count: 1,107
  • Joined
  • Last visited

About shawn

Profile Information
  • Gender: Male

  1. I just want to say thanks again for this, @floele. I was fiddling with my after-update scripts and this thread helped me fix another bug.
  2. There are several examples in the wiki of how to use 7-Zip or UniExtract to extract compressed files. Both 7-Zip and UniExtract support RAR files. The syntax for the command-line version of 7-Zip is:
     7z x -y -o"{root}PATH\" "{file}"
     To include a password, add the -p switch:
     7z x -y -p"th3p455w@rd" -o"{root}PATH\" "{file}"
     (A quick exit-code check for 7-Zip is sketched after this list.)
  3. This is most likely caused by the URL-encoding of the space and ampersand. You can use the ":replace" or ":multireplace" functions, or the :urldecode/:urlencode functions - both approaches can get what you're after, but they depend on you knowing which characters need to be replaced for the specific website and downloaded files. "%20" is a space; "%26" is an ampersand. Many non-English characters may also be encoded, but only on some websites, so it's really a matter of observation and testing for the individual website. When I'm troubleshooting stuff like this I always echo the variable to a text file first so I can see exactly what Ketarin produced (see the decoding sketch after this list).
  4. The ability to replace the user-agent string has become essential. The CloudFront servers responsible for Logitech, SnagIt and others are now completely blocking the HeadlessChrome user-agent.
  5. Or maybe a configuration file that it will parse to populate these variables.
  6. The cache appears to survive only about 10 seconds, so on a large download or a slow site it may not last long enough for the post-update scripts to run without re-downloading the variable webpages. I've changed the command used to extend the timeout to:
     if(!(PS kuppet -ea 0)) {START -WindowStyle hidden BIN\kuppet "8008 30000"}
     (This launch pattern is spelled out with full cmdlet names after this list.) This increases the timeout to 30 seconds and resolves my immediate problem, but it will not accept any value above 30000 (30s). I suspect there is a hard limit in place to ensure that Kuppet doesn't stay in memory forever. Unfortunately, this forc…
  7. It is caching - yay! I didn't read the log very carefully. It looks like the parameters are:
     kuppet.exe [port] [timeout-ms]
     It disconnects if idle for 30s or more no matter what. It's using the following user-agent:
     Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) HeadlessChrome/93.0.4577.63 Safari/537.36
     Can you make it possible to change the user-agent? Some sites block HeadlessChrome by default.
  8. One more feature request: can I get the complete webpage contents and not just the body element?
  9. @Ambimind - this is fantastic. Thank you! Can I request a feature? According to the log, it re-requests the same URL for each instance of a variable that makes a request. Is it possible to make Kuppet cache the results temporarily until it has moved on to the next Application? Since I gather a lot of content from some pages, this could save me a LOT of repeat requests. Are there other switches besides the port that it runs on?
  10. Hi, @jusseppe! I've now had a chance to try this myself, and the biggest issue was that the alternate port I attempted to use would not work until I closed and reopened Ketarin. That made all the difference for me. I've pasted below the exact text you'll want to use in the "global variables" feature, to make it easier to ensure it is correct. For the variable named "run_kuppet":
      if(!(PS kuppet -ea 0)) {START -WindowStyle hidden BIN\kuppet 8008}
      Or, alternatively, use this to show the log as it happens:
      if(!(PS kuppet -ea 0)) {START BIN\kuppet 8008}
      For the variable named…
  11. Note that modern browsers don't allow including an XSL file from file:// URLs, so you'll either need to put it on a website or load it with script (an offline alternative using .NET is sketched after this list).
  12. I recently made a change to the structure I use for all my variables and needed to publish it to my server. Since Ketarin doesn't (yet) support commands for non-updates, I needed to get the data in a structured format from Ketarin into the database (I'll be passing it through Excel). Anyway... this is what I came up with. It uses the Kryogenix sorttable script, plus some styling and stuff from a few other projects. You can use the aCols array to control which columns are hidden by default. For your own custom variables, assign the variable name in one of these: Variables/item[key/string='m…
  13. Thank you, @Ambimind! This looks like it could be very useful.
  14. I know this is years late, but I was looking for something else and saw that one of the issues in this thread wasn't addressed. The problem with your troubleshooting line is most likely that you didn't quote the URL: the shell treats an unquoted ampersand (&) and certain other characters as command syntax, so the script fails before it actually echoes anything to the file if they appear in the {preupdate-url} variable. Try this instead:
      echo TEST "{preupdate-url}" >> T:\xx.txt
  15. Hi, @Etz! They're using the Akamai Captcha, which is a JavaScript-plus-cookie-based method of preventing exactly what you're trying to do. It looks like it should be possible to harvest enough information from the JavaScript and build the cookies manually, but it would be a nightmare to set up initially, and a single change on their side would break it, so I wouldn't bother with it.
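For item 2, here is a minimal PowerShell sketch that wraps the same 7-Zip extraction in an exit-code check, so a failed extraction doesn't go unnoticed. The paths are placeholders, and it assumes 7z.exe is on the PATH; 7-Zip's documented convention is 0 for success, 1 for warnings, and 2 or higher for errors.

    # Extract the archive: x keeps the folder structure, -y answers all prompts with yes.
    7z x -y -o"C:\target" "C:\downloads\archive.rar"
    # Report anything other than a clean exit.
    if ($LASTEXITCODE -ne 0) { Write-Warning "7-Zip exited with code $LASTEXITCODE" }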
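For item 3, a quick way to preview what an encoded URL decodes to outside of Ketarin is the .NET UnescapeDataString call, available in any PowerShell session. The URL below is just an illustration.

    # %20 decodes to a space and %26 to an ampersand.
    [System.Uri]::UnescapeDataString('https://example.com/setup%20x64%26full.zip')
    # -> https://example.com/setup x64&full.zip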
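The launch commands in items 6 and 10 use PowerShell aliases: PS is Get-Process, START is Start-Process, and -ea 0 is -ErrorAction SilentlyContinue. Here is the same launch-if-not-running pattern with the full names, assuming (as described in item 7) that kuppet.exe takes a port and a cache timeout in milliseconds and lives in BIN\ under the Ketarin folder.

    # Start kuppet only if it isn't already running.
    if (!(Get-Process kuppet -ErrorAction SilentlyContinue)) {
        # Port 8008 with a 30000 ms cache timeout (30s appears to be the hard upper limit).
        Start-Process -WindowStyle Hidden BIN\kuppet -ArgumentList '8008','30000'
    }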
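On the XSL note in item 11: if the browser refuses to apply a stylesheet from a file:// URL, one workaround is to run the transform offline with .NET's XslCompiledTransform, which PowerShell can call directly. The file names below are placeholders for the stylesheet and the exported XML.

    # Load the stylesheet, then write the transformed output to a standalone HTML file.
    $xslt = New-Object System.Xml.Xsl.XslCompiledTransform
    $xslt.Load('C:\path\to\style.xsl')
    $xslt.Transform('C:\path\to\jobs.xml', 'C:\path\to\jobs.html')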