In the .NET ecosystem we are used to memory being managed for us: it is allocated and freed when it is no longer in use, and most of the time we do not have to worry about it. But even in .NET we need to think about things like memory and external resources.

I received a complaint from one of our customers that the memory usage of IIS was growing to extreme levels. In addition, the following exception was thrown:
Insufficient winsock resources available to complete socket connection initiation. An operation on a socket could not be performed because the system lacked sufficient buffer space or because a queue was full 127.0.0.1:8028
We started by attaching the dotMemory profiler to the IIS process. Here is a screenshot of the process's memory:
In order to understand what kind of objects were not being released, we took two snapshots during the profiling session: one at the beginning and another one at the end. Then we compared those snapshots. In the comparison view we grouped the objects by namespace, then right-clicked and chose "Open Survived Objects". Survived objects are those that were not released between the snapshots.
In the list of survived objects, we grouped them by dominators and got the following picture:
From the picture above we can see that instances of TransparentProxy are never closed. This looks closely related to the "insufficient winsock resources" error we were receiving.
So now to the easy part: fixing the code :-)
We wrapped every service call in a try/finally block. In the finally block we close the open proxy, which also closes the underlying socket.
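A minimal sketch of the pattern we applied (the `OrdersServiceClient` proxy and its `SubmitOrder` method are hypothetical names for illustration; the close/abort handling follows the usual WCF client guidance, since `Close()` itself can throw when the channel is faulted):

```csharp
// Hypothetical WCF client proxy; the contract and method names are
// placeholders for whatever your generated proxy exposes.
var client = new OrdersServiceClient();
try
{
    client.SubmitOrder(order);
}
finally
{
    // Closing the proxy releases the channel and its socket.
    // A faulted channel cannot be closed gracefully, so Abort()
    // is used to release the resources unconditionally.
    if (client.State == CommunicationState.Faulted)
    {
        client.Abort();
    }
    else
    {
        try
        {
            client.Close();
        }
        catch (CommunicationException)
        {
            client.Abort();
        }
        catch (TimeoutException)
        {
            client.Abort();
        }
    }
}
```

Note that simply wrapping the proxy in a `using` block is not enough here: `Dispose()` calls `Close()`, which throws on a faulted channel and would leave the socket behind, which is exactly the kind of leak we were chasing.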
After applying the fix we ran the dotMemory profiler again to see how the memory behaved. Here is what we got:
The picture looks completely different: instead of a steady increase, we see periodic drops in memory usage. The drops are, of course, the result of garbage collection.
UPD: As @volebamor mentioned in his tweet, "it's more obvious to open new objects instead of survived". Thanks for the comment!