Ketarin forum

Protocol Violation errors: fix?


I'm working through my buggy apps. Some work fine on certain days but fail on others; some of those failures are surely down to server throttling or automation/abuse-prevention measures.

A couple of apps (Krita among them) recently stopped working with Protocol Violation errors. The specific error messages are:

The server committed a protocol violation. Section=ResponseHeader Detail=CR must be followed by LF
The contents of the URL can not be loaded. The server committed a protocol violation. Section=ResponseHeader Detail=CR must be followed by LF


I added the recommended app.config directive that's supposed to resolve this problem:

		<httpWebRequest useUnsafeHeaderParsing="true" />
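For context, here is that directive inside a complete minimal config file. The surrounding element names are the standard .NET ones; I'm assuming the file sits next to the executable as Ketarin.exe.config:

```xml
<?xml version="1.0" encoding="utf-8"?>
<configuration>
  <system.net>
    <settings>
      <!-- Tolerate malformed response headers, e.g. a CR not followed by LF -->
      <httpWebRequest useUnsafeHeaderParsing="true" />
    </settings>
  </system.net>
</configuration>
```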


On its own, it didn't help. It did, however, change the error to something else:

The contents of the URL can not be loaded: The underlying connection was closed: An unexpected error occurred on a send.


I changed the UA to wget and tried again. Now it works. The server probably has a filter that lets wget through whatever abuse-prevention checks are in place, or maybe Ketarin behaves differently when a UA is explicitly assigned.
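In HttpWebRequest terms (which I assume Ketarin uses under the hood), the UA change amounts to something like this sketch; the URL and the wget version string are placeholders, not what Ketarin actually sends:

```csharp
using System.Net;

// Hypothetical sketch: fetch a page while presenting a wget-style User-Agent.
HttpWebRequest request = (HttpWebRequest)WebRequest.Create("https://krita.org/");
request.UserAgent = "Wget/1.13.4"; // any wget-like string; the exact version is arbitrary
using (HttpWebResponse response = (HttpWebResponse)request.GetResponse())
{
    // read response.GetResponseStream() here
}
```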

This app hadn't worked since August, so the change is a "fix" of sorts, but a strange one, and the next Ketarin release will probably overwrite my app.config, so I'll have to reapply it to keep working around the problem.

I tried removing the new app.config directive and keeping only the wget UA: it failed again with the protocol violation error. With both the directive and the UA change in place, it now succeeds consistently.

I know that allowing unsafe header parsing isn't ideal, but since we're dealing with servers that have cache-control and abuse-prevention features in place, it's likely the only way past them.

I'm open to alternative fixes if anyone has suggestions.


Sure! Here's a minimal Krita x64 job definition (truncated) to demonstrate the issue:

<?xml version='1.0' encoding='utf-8'?>
  <ApplicationJob xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xmlns:xsd="http://www.w3.org/2001/XMLSchema" Guid="d145b6eb-736a-47bf-b546-66b12e20f371">
    <UserNotes />
    <DownloadDate xsi:nil="true" />
    <HashVariable />
    <HttpReferer />
    <SetupInstructions />
    <ExecuteCommand />
    <ExecutePreCommand />
    <FileHippoId />
    <Name>Krita (x64)</Name>

With useUnsafeHeaderParsing disabled it fails. I suspect the site uses Incapsula to stop "bots" from scraping it; the previous link shows that Incapsula breaks some bots intentionally, even though it's this easy to get around. Would you be interested in either enabling useUnsafeHeaderParsing globally, or exposing a per-app setting for it, implemented via Reflection? That would keep safe header parsing everywhere possible and enable the unsafe mode only where needed.
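For the per-app idea, the commonly circulated Reflection trick for toggling this at runtime looks roughly like the sketch below. Caveat: `SettingsSectionInternal` is a non-public implementation detail of the .NET Framework, so this could break between framework versions:

```csharp
using System.Net.Configuration;
using System.Reflection;

public static class UnsafeHeaderParsing
{
    // Flip the internal useUnsafeHeaderParsing flag at runtime.
    // Returns false if any of the internal members can't be found.
    public static bool Set(bool useUnsafe)
    {
        Assembly assembly = Assembly.GetAssembly(typeof(SettingsSection));
        if (assembly == null) return false;

        Type sectionType = assembly.GetType("System.Net.Configuration.SettingsSectionInternal");
        if (sectionType == null) return false;

        // Grab the singleton settings instance via its non-public static "Section" property.
        object section = sectionType.InvokeMember("Section",
            BindingFlags.Static | BindingFlags.GetProperty | BindingFlags.NonPublic,
            null, null, new object[] { });
        if (section == null) return false;

        FieldInfo field = sectionType.GetField("useUnsafeHeaderParsing",
            BindingFlags.NonPublic | BindingFlags.Instance);
        if (field == null) return false;

        field.SetValue(section, useUnsafe);
        return true;
    }
}
```

A per-app setting could call `UnsafeHeaderParsing.Set(true)` before the download and `Set(false)` afterwards, so only the apps that need it run with relaxed parsing.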

Incidentally, making this app.config change reduced the number of errors I was getting from ~104 to ~80, so it has fixed several of my other apps as well. I wish I'd kept better error logs, though, so I could tell which ones without turning it back off and re-triggering the errors.

