Recently while running a PowerShell script, I started having some problems that had never happened before. I hadn’t made any changes to the script, but the section of the script that remotes into a SharePoint server and updates a list quit working for some reason. The script was just hanging there, and would eventually time out on that section and move on. Luckily, PowerShell threw an error at me to give me some idea of what was happening.
Processing data for a remote command failed with the following error message: Not enough storage is available to complete this operation. For more information, see the about_Remote_Troubleshooting Help topic.
“Not enough storage is available”? The first thing my brain did was flash back to NT4 Service Pack 3. Do I need to set the IRPStackSize registry key? Probably not, and I was really surprised I remembered that issue; it’s been a decade or so.
After some Googling with Bing, I discovered that by default, WSMan allocates only 150 MB of memory to each remote shell. You can verify that by running this one-liner:

Get-Item WSMan:\localhost\Shell\MaxMemoryPerShellMB
That should return 150 if you haven’t made any modifications. My script wasn’t putting that much data into SharePoint, but this looked like a likely culprit, and hey, memory is cheap these days. So I bumped that 150 MB up to 1 GB.
Set-Item WSMan:\localhost\Shell\MaxMemoryPerShellMB -Value 1024
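If you’d rather not RDP into the server to make the change, the WSMan: drive can also be pointed at a remote machine with Connect-WSMan. A quick sketch of what that looks like (the server name SPWEB01 is just a placeholder; you’ll need admin rights on the target, and WinRM must already be running there):

```powershell
# Connect the WSMan: drive to the remote server (hypothetical name)
Connect-WSMan -ComputerName SPWEB01

# The remote machine now appears as a node under WSMan: -- check the current quota
Get-Item WSMan:\SPWEB01\Shell\MaxMemoryPerShellMB

# Raise the per-shell memory quota on the remote box to 1 GB
Set-Item WSMan:\SPWEB01\Shell\MaxMemoryPerShellMB -Value 1024

# Tidy up the connection when done
Disconnect-WSMan -ComputerName SPWEB01
```

Note the new quota only applies to remote shells created after the change; sessions that were already open keep the old limit.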
After changing that setting and running the script again… Bingo! The SharePoint data was written successfully and the script was fully functional again. I’m not sure what caused the problem in the first place, since nothing had changed in the script or on the server, but as long as it’s working, I’m happy.