Panda Crazy Script for Pandas

Johnnyrs

Active Member
Contributor
Crowd Pleaser
Joined
Jan 12, 2016
Messages
546
Reaction score
1,603
Points
668
Age
54
Location
Whittier, California
Gender
Male
Johnnyrs @Johnnyrs - That worked for the alert. You'll need to update the link that opens the job on the new worker site to type in the captcha as well; it's opening the www site even though the alert popped on the worker site.
Thanks a lot! I'll update the script so it can detect captcha on both sites and open the link on the correct site.
 

Johnnyrs

Hi. First, I want to say thanks for creating such a program. I'm still pretty new to Turking, but this script is awesome. I wish I had the knowledge and skills to make things like this; I hope you're proud. Second, I wanted to ask if there is a way I could sync my jobs between two computers. I tried to export from one to the other, thinking it would only add the ones it didn't have or let me choose, but it wiped everything and cloned the import. I go back and forth between a desktop and a laptop, so I don't always have them running the same jobs, and I was looking for a way to sync the ones the other is missing without having to manually add each of them. Is there a way I can do that?
Yes, the import deletes the jobs or alarms and then replaces them with the imported data only. I never thought about a sync, but it could be a good future feature: it would only add new jobs from the imported file and not delete anything. Having it delete jobs that aren't in the import file could get tricky. So for now you can't do it, but you can try to always export a file every time you switch computers. I'll think about how to get a sync working in the future.
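A sync like the one described above is usually an additive merge rather than a replace; a minimal sketch, assuming each job is a plain object with a `groupId` key (an assumption for illustration, not the script's actual data layout):

```javascript
// Hypothetical merge-import: keep every existing job, add only the
// imported jobs whose key isn't already present. Nothing is deleted.
// "groupId" as the unique key is an assumption for this sketch.
function mergeJobs(existing, imported, key = "groupId") {
  const seen = new Set(existing.map(job => job[key]));
  return existing.concat(imported.filter(job => !seen.has(job[key])));
}
```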
 

Johnnyrs

Today I released a new update for Panda Crazy Queue Helper (0.3.0) to Greasy Fork. Be sure to update it if Greasemonkey/Tampermonkey doesn't do it automatically; it should update automatically sometime today. This is only for the Queue Helper, and the new features will only work with the newest Panda Crazy (0.5.0).

New Features:
  • A new option to go to the next hit with the same title as the current hit is now in the queue menu.
  • Going to https://www.mturk.com/mturk/myhits?JRPC=nexthit will go to the next hit in your queue that isn't open in another tab. This also works for the link showing all queue hits. There is a delay of around 1 second when using JRPC=nexthit because it lets tabs communicate which hit each one is doing, which lowers the chance of PRE's. This feature should work without Panda Crazy running, too.
  • Multiple open tabs should communicate a bit faster with each other so the next hit is ready.
  • Reworked the message passing between scripts to use less memory and be less messy.
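The JRPC=nexthit marker described above is just a query parameter, so detecting it can be sketched like this (an illustrative function, not the actual Queue Helper code):

```javascript
// Returns true when a link carries the JRPC=nexthit marker described
// in the post. Uses the standard URL API; purely illustrative.
function wantsNextHit(href) {
  const url = new URL(href);
  return url.searchParams.get("JRPC") === "nexthit";
}
```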
Bugs Fixed:
  • Fixed it so you shouldn't get an error about already doing a hit after you finish your queue.
  • When accepting a hit the script should tell Panda Crazy the hit has been added correctly.
  • Script shouldn't send a hit submitted message to Panda Crazy when you just browse away from a hit now.
This script will not work on the new worker.mturk.com site. I'll be working on that later.

There was also a small fix to the Panda Crazy Helper script today so it reads the Projected Earnings from the new MTS update. It only passes that info to PC on the dashboard page.
 

Kadauchi

Well-Known Member
Master Pleaser
Crowd Pleaser
Joined
Jan 12, 2016
Messages
7,098
Reaction score
21,951
Points
1,263
Thanks! So that is working on the new site, but it probably breaks detection on the old site, right?
If so, try replacing that line with:
Code:
if (htmlDoc.getElementsByName("userCaptchaResponse").length || htmlDoc.getElementsByClassName("captcha-image").length) {
Tell me if that works on both sites and I'll get it fixed in a new update.
I use this to detect captchas on both.
Code:
if (document.querySelector(`img[src^="https://opfcaptcha-prod.s3.amazonaws.com/"]`))
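Both snippets look for markers of the captcha page; the string check behind the selector above can be factored into a pure helper (a hypothetical function, not from either script) so the prefix logic is easy to verify outside a browser:

```javascript
// True when an image URL looks like an MTurk captcha image, i.e. it is
// served from the opfcaptcha-prod S3 bucket used in the selector above.
function isCaptchaImageSrc(src) {
  return typeof src === "string" &&
         src.startsWith("https://opfcaptcha-prod.s3.amazonaws.com/");
}

// In a browser you would combine it with the DOM checks from both posts,
// e.g. [...document.images].some(img => isCaptchaImageSrc(img.src)).
```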
 
  • Like
Reactions: A6_Foul_Out

Johnnyrs

Thanks! I'll try that out. I usually don't use querySelector because it's a bit slower, but for this it will probably take the same time or be faster.
 

Kadauchi

Yeah, I only mentioned it because it might be faster than running two getElementsBy* calls, but I didn't bother with a jsperf for anything.
 

Johnnyrs

Today I released a new update (0.5.1) to Greasy Fork. Be sure to update it if Greasemonkey/Tampermonkey doesn't do it automatically; it should update automatically sometime tomorrow. Be sure to back up your data by exporting it from the jobs menu at the top so you won't lose anything if something goes wrong.

New Features:
  • The script will unpause automatically after you close the popup window for filling in a captcha or logging back in.
  • Search-mode jobs should be a bit faster after the last update.
Bugs Fixed:
  • Captchas should get detected on the new worker site.
  • Max and min price search options should now save correctly.
This is mainly a fix for the last update. I can't test captchas with my account, so if it doesn't work I can try another way. The Queue Helper script was also updated today; it may fix some problems with finding the next hit and figuring out the current hit's position. I'm hoping to get the helper scripts working on the new worker site once I'm sure this script is working as it should.
 

turkenator

━╤デ╦︻(▀̿̿Ĺ̯̿̿▀̿ ̿)
Contributor
Joined
Feb 5, 2016
Messages
187
Reaction score
328
Points
388
Gender
Male
Occasionally, the JR Mturk Panda Crazy Queue Helper script somehow breaks working batches from the checkbox.

It causes the error "URI too large".

Somehow the URL ends up with something like "prevRequester=XXXXXXXXXXXXXX" repeated over and over at the end. Not sure how or why this is happening. It definitely stops if I turn off Queue Helper.
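A runaway parameter like this usually comes from appending to the query string on every pass instead of setting it; a hedged sketch of the difference (illustrative code, not the actual Queue Helper fix):

```javascript
// Naive append: calling this repeatedly grows the URL until the server
// rejects it with "URI too large".
function addPrevRequesterBuggy(href, id) {
  return href + "&prevRequester=" + id;
}

// Deduplicating version: URLSearchParams.set replaces any existing value,
// so the parameter appears exactly once no matter how often it runs.
function addPrevRequesterFixed(href, id) {
  const url = new URL(href);
  url.searchParams.set("prevRequester", id);
  return url.toString();
}
```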
 

Johnnyrs

Thank you very much. I didn't realize that was happening because I haven't worked from a batch like that to test it. I have a fix for it and will update sometime today. I found some other bugs while fixing this one, too.
 
  • Like
Reactions: turkenator

Johnnyrs

Today I released a new update for Panda Crazy Queue Helper (0.3.2) to Greasy Fork. Be sure to update it if Greasemonkey/Tampermonkey doesn't do it automatically; it should update automatically sometime today.

Bugs Fixed:

  • Fixed a bug where it would add extra info at the end of the nextaccept link when working off a batch.
  • Fixed a bug where it would change the nextaccept link to go to the next hit in the queue instead of leaving it alone when working off a batch.
  • Fixed a bug where it didn't grab the hitid when working off a batch.
  • Fixed a bug where it didn't set the timeleft data correctly, which made things look wrong.
  • It should now update the Panda Crazy queue watch correctly when working off a batch.
  • The title will show position 0 if the queue data hasn't been updated yet.
 
  • Like
Reactions: Peachy and Barbwire

JrRandy

New Member
Joined
Apr 3, 2017
Messages
9
Reaction score
0
Points
201
Age
46
I can't test captchas with my account so if it doesn't work I can try another way.
It's working on both sites :) I am getting PRE's on the www. page that I don't think I should be getting, but I can't decipher what is happening. I turned debugging on, but the console doesn't show the expected "got a PRE for a search or pantha on: " + finalUrl. They are all listed as normal PRE's. I have even turned it down to 1 request every 1000ms and I still see them periodically. Are there any other causes for the PRE count to increase besides true PRE's?

~edit~ Rolled back to 0.4.4, and it's still occurring. Maybe mTurk fixed it so Worker and WWW share a PRE count lol
 

Johnnyrs

Well, I know the worker and www sites have separate PRE counts, because I am using both today without getting too many PRE's; I actually have only 10 PRE's on the www site and 0 on the worker site.

I do recommend not using the www site and the new site at the same time in the same browser. There is a bug on mturk that will sometimes cause hits not to be submitted because of a cookie it sets. I use regular Chrome for the www site and Chrome Portable for the worker site.

Also remember, if you are using Hit Finder or other scripts, know which site you are running them on so you can set your timer to accommodate the other scripts. I also don't recommend working off the queue from the new site, because not all hits work there and it doesn't advance to the next hit without a script.

Also check whether you have auto slowdown stopped. The script now remembers what you set it at, so that could cause PRE's if you don't let it slow down automatically again. The best way to test things is to turn off all scripts, then turn them back on one by one a few minutes apart to see which one is conflicting, and then decide what the timers should be.

-- Also be sure you see text that says "-NEW SITE-" at the top, after the help button, if you are using it on the new site. If you don't see that, it's still using the old site.
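The timer advice above amounts to rate-gating requests; a hypothetical sketch (not Panda Crazy's actual code) of a minimum-interval gate that refuses to fire too soon:

```javascript
// Returns a function that fires at most once per minMs milliseconds.
// The injectable clock (now) makes the gate testable without waiting.
function makeGate(minMs, now = Date.now) {
  let last = -Infinity;
  return function tryFire() {
    const t = now();
    if (t - last < minMs) return false; // too soon: skip this cycle
    last = t;
    return true;
  };
}
```

Each script keeping its own gate at a sane interval is what keeps several of them from piling requests on top of each other and triggering PRE's.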
 

JrRandy

Yeah, it seems to be a conflicting-script issue; just a matter of figuring out which one it is. I disabled all scripts except Panda Crazy: no PRE's. Re-added Queue Helper: no PRE's. I will continue to add one at a time until I find the culprit :) As for www and worker, I have them running on 2 separate machines, so no conflict there :)
 

Dawn

New Member
Joined
Aug 3, 2016
Messages
2
Reaction score
0
Points
1
Age
54
Gender
Female
Hi,

I'm having trouble with PC in Chrome. It will not stay running. I've turned off the background-tab throttling setting, and it still will not stay running unless I am on that tab. I tried both 5.1 and 5.0 and neither is working right. They were working before; I'm just not sure when they stopped, sometime in the last few days I think. Help?
 

Johnnyrs

It's recommended to run PC in a window of its own so it won't be slowed down by browser security features. I haven't tested disabling the background-tab throttling, so I have no idea if it works, and they might decide to make it stop working, so I can't recommend it. It's easier to just pull the PC tab out into its own window and keep it in the background under other windows; that way you can also see the queue watch if you have a big screen.
 

AJ1

New Member
Joined
Apr 29, 2017
Messages
1
Reaction score
0
Points
1
Age
59
Hi there, I went to get this script from the site you indicated up top, and a message appeared at the top of the Greasy Fork page saying that user scripts could not be added from that website. I wonder if it has to do with the https? Or what? I'd really love to try this out. Thanks!
 

jan

Moderator
Moderator
Joined
Jan 12, 2016
Messages
26,586
Reaction score
51,066
Points
1,463
Gender
Female
What browser were you using?
 

sinon

Watashi ga Kita!
Contributor
Joined
Jan 12, 2016
Messages
3,014
Reaction score
7,112
Points
838
Gender
Female
Logged into PC today and everything is outlined in red? Is this a new change or did something happen lol
 

Dawn

That does not work either. It will not stay running at all. It used to work, but now it will not, even though I have turned off tab throttling.
 

catnapped

Relatively Unknown Member
Contributor
Crowd Pleaser
HIT Poster
Joined
Jan 13, 2016
Messages
20,908
Reaction score
43,057
Points
2,738
Age
52
Location
Pennsylvania
Gender
Male
I've got several issues:

a) On the other computer, if I run the Queue Helper I can't return anything from the queue page (I get a page-not-found error or something). Turn the script off and it works again. Not sure if the issue affects this computer. *edit: confirmed it's happening on all my computers, and other people have had the same issue*

b) I've been tossed back to the queue page numerous times on a batch (as opposed to it going in sequence). Not sure what's causing that.

c) The bigger problem is that even running a panda (batch) at 2000+ ms I'm still getting PRE'd up the wazoo, never mind running it at a shorter interval than that; I can forget about running more than maybe one panda at a time without PRE issues. I can't for the life of me understand how people can run 15-20 pandas and still manage to work. That's just impossible for me.
 
  • Like
Reactions: Jaded