Posts

Showing posts from October 13, 2018

Change permissions of a file with my cat's help

I realize the question title probably sounds weird, theoretical, or just plain stupid. There's a long backstory to this!

root@system:~# less myfile
-bash: /bin/less: Input/output error

The root filesystem is dead. But my cat is still alive (in my memory):

root@system:~# cat > /tmp/somefile
^D
root@system:~#

He's kind of lonely though, all his friends are gone:

root@system:~# mount
-bash: /bin/mount: Input/output error
root@system:~# dmesg
-bash: /bin/dmesg: Input/output error
root@system:~# less
-bash: /bin/less: Input/output error
root@system:~# chmod
-bash: /bin/chmod: Input/output error

The system is still running and fulfilling its purpose. I know, I know, the only sane response to this is to take the system down and replace the root drive. Unfortunately that's not an option, as it would cost a lot of time and money. Also, it would kill my cat, and that would make me sad. I've thought of br...
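One property that makes the surviving cat genuinely useful here: shell redirection with > truncates an existing file in place rather than recreating it, so the target keeps its ownership and mode bits. A minimal sketch of that behavior, demonstrated on a healthy system (the file name script.sh is hypothetical, not from the question):

$ touch script.sh && chmod 755 script.sh   # an existing file carrying the mode we want
$ cat > script.sh <<'EOF'
#!/bin/sh
echo hello
EOF
$ ls -l script.sh                          # still -rwxr-xr-x: the mode bits survived the rewrite

In other words, if some file on the dead box already has the permissions you need, cat can refill its contents without ever invoking /bin/chmod.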

Setting the batch size via Bulk API

I'm using the Bulk API to make updates to ~50k records. I'm splitting those up into jobs of about 5k apiece, but I was hoping there'd be a way to control the batch size, similar to what Data Loader offers (i.e. you can tell it to process in chunks of 200 records at a time, or 1 record at a time). Is this configurable? Or, if not, is there a different way to achieve this? The main issue is that some of the jobs fail because a record triggers too many workflows that bog down the execution time, so I'd rather isolate those and use smaller batch sizes.
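As far as I know, the Bulk API has no job-level batch-size setting; the batch size is simply however many records you put in each batch you add to the job, and triggers then see each batch in chunks of up to 200 records (a batch smaller than 200 becomes a single smaller chunk). A rough sketch of doing the splitting yourself with curl against Bulk API v1; the instance URL, session ID, job id, object name, and file names are all placeholders:

# chop the source CSV into 1,000-record batch files
# (this sketch ignores the CSV header row for brevity; real use would
#  need to re-prepend it to every chunk)
split -l 1000 records.csv batch_

# create an update job
curl -s "https://yourInstance.salesforce.com/services/async/44.0/job" \
  -H "X-SFDC-Session: $SESSION_ID" \
  -H "Content-Type: application/xml" \
  -d '<jobInfo xmlns="http://www.force.com/2009/06/asyncapi/dataload">
        <operation>update</operation>
        <object>Account</object>
        <contentType>CSV</contentType>
      </jobInfo>'

# add each chunk to the job as its own batch
for f in batch_*; do
  curl -s "https://yourInstance.salesforce.com/services/async/44.0/job/$JOB_ID/batch" \
    -H "X-SFDC-Session: $SESSION_ID" \
    -H "Content-Type: text/csv" \
    --data-binary "@$f"
done

The jobs that keep failing could then be re-run with split -l 200 (or smaller) to isolate the record whose workflows drag the execution time down.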