Avoid double postings

Situation: I have a forum, and for the last decade posting new entries has been done via a simple HTTP form created with the FormHelper. The forum is supposed to run on a cheap web host, so the reliability isn’t always the best. Sometimes the posting form was sent into the timeout/server-error nirvana, which is of course frustrating.

In the future the form will be sent via JS in the background, and the page won’t be left before the server responds with an OK.

I also want to prevent double postings in the case where the server received and executed the request (creating a new posting) but the response failed to reach the client.

I’ve thought about a few schemes that could work, but I’m not really happy with any of them so far. The best I came up with: attach a UUID to a new posting form, send that UUID with the request, store it on the server, and check whether that UUID was already used when a new request comes in.
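A minimal client-side sketch of that scheme, assuming a modern browser where crypto.randomUUID() is available; the /postings endpoint and payload shape are made up for illustration:

// Generate one UUID per form render and reuse it for every retry of the
// same posting, so the server can recognize and deduplicate repeats.
const idempotencyKey = crypto.randomUUID();

async function submitPosting(text) {
  const response = await fetch('/postings', {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({ uuid: idempotencyKey, text: text }),
  });
  // The server inserts the posting only if the UUID is unseen; otherwise
  // it answers with the posting that was already created for this UUID.
  return response.json();
}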

Any other ideas?

let canSend = true;

document.addEventListener('submit', function (event) {
  if (!event.target.matches('form#my-form')) return;

  event.preventDefault(); // the request is sent via JS instead

  if (!canSend) return; // a request is already in flight
  canSend = false;

  // send your request here and, if it's a success, flip canSend again
});

Thanks. I do that already, including disabling the submit button; otherwise people with nervous fingers would create double postings without end.

The question is about what comes later: the client sends the request, it’s successfully executed by the server, but the client doesn’t receive the response for whatever reason.

So the client thinks the request wasn’t successful even though it was. Of course, in that case the client allows the user to submit again, and that’s where the potential double posting occurs.

Well… it’s an edge case that shouldn’t really exist at all…
When somebody posts a comment on my website, it just sits there, waiting… maybe indefinitely… without allowing the user to re-send without refreshing the page.
Maybe you can add something like a timer to your request that shows a toast: “It is taking longer than expected… please hang on!”
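A rough sketch of that timer idea; showToast() is a hypothetical UI helper, and the 10-second threshold is arbitrary:

async function sendWithPatienceToast(url, payload) {
  // If no response has arrived after 10 seconds, reassure the user
  // instead of letting them assume the request failed.
  const slowTimer = setTimeout(function () {
    showToast('It is taking longer than expected… please hang on!');
  }, 10000);

  try {
    return await fetch(url, { method: 'POST', body: payload });
  } finally {
    clearTimeout(slowTimer); // response arrived (or failed); cancel the toast
  }
}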

The best solution (and I know this may come as no surprise) is to get a decent web host…
Web hosting is not very expensive anymore.
If your web host is already taking so long to process one comment, then imagine what happens when 100 people are active at the same time…
A simple package at OVH costs between 2.50 and 6.10 euros a month, and you get decent performance over there.

“Cheap is expensive”

Thanks again for the input. Yes, it’s an edge case, and not handling it isn’t the end of the world, but maybe there’s a good pattern out there I’m not aware of.

It’s OSS and I don’t control the installations people are using. I know it was run on a free web host, and that was a rough experience server-wise.

The performance is great at 8 bucks or more, but even then there might still be network issues. The likelihood spikes for a single client if the user is, e.g., a passenger in a train or car with a spotty on-and-off connection.

Not allowing a resubmit is technically solid, but not an exceptionally desirable solution from a user-experience point of view.

You can use the PRG (Post/Redirect/Get) pattern.

There is a Cake example of it in the Cake Search plugin.
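For reference, the PRG flow in a generic Node/Express sketch (the actual app is CakePHP; Express is used here purely for illustration):

const express = require('express');
const app = express();
app.use(express.urlencoded({ extended: true }));

app.post('/postings', function (req, res) {
  // ... create the posting here ...
  // Redirect with 303 See Other, so a browser refresh re-issues a
  // harmless GET instead of re-submitting the POST.
  res.redirect(303, '/postings/latest');
});

app.get('/postings/latest', function (req, res) {
  res.send('posting created');
});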

The PRG pattern is already applied. In my case I want to protect the user from something going wrong on the P and having them sit in front of a white page, losing the content they were working on for the last twenty minutes.

The question is: in case the P went through but the R doesn’t reach the user, what do you do? Especially if you want to give the user a chance to initiate another P, assuming that a failing P is the 99% error case.

What about checking the user’s latest posting and simply comparing it with the incoming one on the server side?
If they are equal: send the redirect/GET again.
If they are not: save the posting and send the redirect/GET.

This way you simply don’t need to care whether a user submits the same request multiple times, for whatever reason…
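A sketch of that idempotent handler, again in Express for illustration; findLatestPosting() and savePosting() are hypothetical data-layer helpers, and req.user is assumed to be set by some auth middleware:

const express = require('express');
const app = express();
app.use(express.json());

app.post('/postings', async function (req, res) {
  const latest = await findLatestPosting(req.user.id);

  // A retried request whose first attempt already succeeded arrives
  // with identical content: skip the insert and just repeat the redirect.
  if (!latest || latest.text !== req.body.text) {
    await savePosting(req.user.id, req.body.text);
  }

  res.redirect(303, '/postings/latest');
});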

That was an option I considered; my thoughts were these:

That means the input fields have to be locked so the user isn’t able to change the posting. The obvious way is to set the disabled HTML attribute. There’s only one problem: the last line of defense, if a user isn’t able to reach the server, is to copy & paste the content out and store it locally. And of course you can’t copy & paste from disabled input fields.

So you have to a) do some JavaScript magic on the client to keep the fields immutable but accessible, and b) do some comparing on the server. Possible, but overall it felt a little bit hacky.
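One way to get a) without locking users out of their own text is the readonly attribute: unlike disabled, readonly fields can still be selected and copied (and are still submitted with the form). A minimal sketch, reusing the form id from the snippet above:

document.querySelector('form#my-form').addEventListener('submit', function () {
  // readonly fields can't be edited anymore, but their content can
  // still be selected and copied out, unlike disabled fields.
  document.querySelectorAll('#my-form input[type=text], #my-form textarea')
    .forEach(function (field) { field.readOnly = true; });
});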

I never rely on JS alone. It can easily break or be worked around.
I also use a server-side check here for my posts (in a forum board).

There is a validation rule that just checks that, for this user (and maybe this thread), there is no identical message already at the top of the stack (order desc, limit 1). This works quite well, as you usually never post the exact same thing twice in a row; that is the definition of a double post :slight_smile:

Thanks for all the ideas. :slight_smile:

PS: I’m still somewhat undecided and am omitting a strict server-side check for the moment. If someone wants to probe the implementation (login: test/test).

@Schlaefer

I checked. Looks like it’s working.