receiving files over 2 GB through sockets c++


I am wondering how I would do that. Make an array the size of the file in memory? Or open a file and append to it (with fwrite) until it's done, then close the file?

What would be the best way to do it so I won't damage the drive (whether SATA, SSD, or memory stick)?

I'd rather not hold the whole thing in a buffer in memory.

I need this feature to run on Windows and Android.

The basic question is what would actually damage the disk when doing this kind of thing. These are only my thoughts, but I came up with the idea that if I'm constantly writing to the drive until the download finishes, I might damage the storage.

Same for reading: is it safe to fopen a file and keep it open, iterating with fread until it's all sent?
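For reference, the "open a file and append with fwrite until it's done" approach from the question looks roughly like this. It's only a sketch: it assumes a connected blocking socket, that the sender transmits the file size first, and the 64 KB chunk size is arbitrary.

#include <cstdio>
#include <cstdint>
#include <vector>
#include <sys/socket.h>   // <winsock2.h> on Windows; recv() there takes/returns int

bool receive_to_file(int sock, const char* path, uint64_t expected_size)
{
    FILE* out = std::fopen(path, "wb");
    if (!out) return false;

    std::vector<char> buffer(64 * 1024);   // 64 KB chunks, never a 2 GB buffer
    uint64_t received = 0;
    bool ok = true;

    while (received < expected_size)
    {
        ssize_t n = recv(sock, buffer.data(), buffer.size(), 0);
        if (n <= 0) { ok = false; break; }                        // peer closed or error

        if (std::fwrite(buffer.data(), 1, static_cast<size_t>(n), out)
                != static_cast<size_t>(n)) { ok = false; break; } // short write

        received += static_cast<uint64_t>(n);
    }

    std::fclose(out);   // flushes stdio's buffer; the OS handles disk caching
    return ok && received == expected_size;
}

Nothing here holds more than one chunk in memory, and the OS's write cache decides when the data actually hits the disk.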


_WeirdCat_ said:
What would be the best way to do it so I won't damage the drive (whether SATA, SSD, or memory stick)?

I've never heard of even a virus being able to physically damage a hard drive. I guess it's possible?

I hope there are safety measures built into most operating systems (ie the ones I use) to eliminate file damage caused by bad programmers. (not calling you a bad programmer)

🙂🙂🙂🙂🙂 ← The tone posse, ready for action.

Yeah bud, bad read/write behavior over a long period of time just worries me, and yes, that's about bad programming.

2 GB isn't much though. There's maybe an argument to be had about keeping a consumer drive running 24/7 without idle, but even then I believe the things are actually pretty reliable (actual NAS/server drives more so).

That said, what is the data? If it's basically a file download, I'd be inclined to leave that to existing solutions. Use, say, HTTP with Apache/Nginx/IIS on the server and your pick of the various HTTP client solutions, and take advantage of range requests (Range request header, 206 Partial Content response with Content-Range header). No one wants to start over on a slow internet connection (or server) if the socket disconnects at 1.9 GB.

For download resuming you can take the size of the file downloaded/written so far, then use an open-ended range, such as Range: bytes=1000-.
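If you do hand this to an HTTP client library, resuming is mostly handled for you. As a rough sketch only, assuming libcurl as the client (the URL, path, and function names here are just illustrative), CURLOPT_RESUME_FROM_LARGE makes libcurl send exactly that open-ended Range header:

#include <curl/curl.h>
#include <cstdio>
#include <sys/stat.h>   // stat(); on Windows you'd use _stat64 or GetFileAttributesEx

static size_t write_cb(char* data, size_t size, size_t nmemb, void* userp)
{
    // Append whatever the server sends to the already-open output file.
    return fwrite(data, size, nmemb, static_cast<FILE*>(userp));
}

bool resume_download(const char* url, const char* path)
{
    struct stat st {};
    curl_off_t offset = (stat(path, &st) == 0) ? st.st_size : 0;   // bytes we already have

    FILE* out = fopen(path, offset ? "ab" : "wb");                 // append when resuming
    if (!out) return false;

    CURL* curl = curl_easy_init();
    curl_easy_setopt(curl, CURLOPT_URL, url);
    curl_easy_setopt(curl, CURLOPT_WRITEFUNCTION, write_cb);
    curl_easy_setopt(curl, CURLOPT_WRITEDATA, out);
    curl_easy_setopt(curl, CURLOPT_RESUME_FROM_LARGE, offset);     // sends "Range: bytes=<offset>-"
    CURLcode rc = curl_easy_perform(curl);

    curl_easy_cleanup(curl);
    fclose(out);
    return rc == CURLE_OK;
}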

EDIT:

fleabay said:
I hope there are safety measures built into most operating systems (ie the ones I use) to eliminate file damage caused by bad programmers. (not calling you a bad programmer)

Like what though? On most systems, if a program wants to overwrite or break one of that user's files, nothing much will stop it, unless the user has some sort of backup enabled. How is the OS meant to tell the difference between Word saving a docx and your broken download program trashing one?

What the OS generally does protect is the overall file system: you shouldn't be able to create/remove/rename/delete/etc. your way into a broken filesystem, and getting direct block-level access should require root/admin (probably more because it bypasses any file/folder-level permissions).

I never heard of disk damage just from continuously writing data to a drive; that would mean every database is a potential hardware killer. Very unlikely to me!

If you have large files to read/write from/to disk, think about using memory-mapped I/O so the OS manages the caching and I/O of those files for you. Then, depending on the protocol you use when sending those files, wait until all data has been received and close the memory mapping.
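As a rough illustration of that memory-mapped route, here is a minimal POSIX sketch (mmap is what Android/Linux gives you; on Windows the equivalent pair is CreateFileMapping/MapViewOfFile). The function name and the single memcpy are just for demonstration; a real receiver would copy each chunk in at its offset as it arrives:

#include <fcntl.h>
#include <sys/mman.h>
#include <unistd.h>
#include <cstring>

bool write_via_mapping(const char* path, const char* data, size_t total_size)
{
    int fd = open(path, O_RDWR | O_CREAT | O_TRUNC, 0644);
    if (fd < 0) return false;

    // The file must already have its final size before it can be mapped.
    if (ftruncate(fd, static_cast<off_t>(total_size)) != 0) { close(fd); return false; }

    void* map = mmap(nullptr, total_size, PROT_READ | PROT_WRITE, MAP_SHARED, fd, 0);
    if (map == MAP_FAILED) { close(fd); return false; }

    std::memcpy(map, data, total_size);   // real code: copy each received chunk at its offset

    munmap(map, total_size);              // the OS flushes dirty pages; msync() forces it earlier
    close(fd);
    return true;
}

One caveat for files over 2 GB: a 32-bit process (still common on Android) may not have enough address space to map the whole file at once, so you'd map and unmap it in smaller windows.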

If possible, I also suggest splitting the data into smaller packets/messages that you send to the client.

Just write your data and let the OS worry about disk caching. Yes, making lots of small writes to an SSD is bad (because to write anything at all the hardware always has to write a full block), but every OS already performs disk caching. Or write in chunks of 1 KB or so if you're really worried.

Far more important is being able to deal with interrupted downloads. Given a bad enough internet connection (like the one I had only a few years ago), 2GB can take more than a day of downloading, with multiple interruptions. I remember having trouble downloading files as small as 50MB because the connection kept timing out before the whole file was downloaded, forcing me to start the whole download over. Not fun.

