DISCLAIMER - I'm not a developer, I'm an ideas guy :-). I currently have a working system that I want to enhance, and to do so I plan to employ the services of a freelance developer.
My question is to ask for advice so I can then explain to a developer what I need done. If you can help or are interested in the project, please let me know...
I am currently using tshark as a simple scanning tool to record wireless MACs and RSSI, and to date/time stamp them. This is all done on an RPi.
At the moment the data is captured and sent as JSON to my server at AWS in real time, as soon as scan results are received... In a busy environment it generates quite a bit of traffic :-)
The issue is that when I lose internet connectivity I also lose all my scan results. Although this isn't a massive problem, I would like to resolve it. Originally I was thinking of capturing the scan results, sending them to a locally running instance of SQL, and then replicating the data from SQL to my server... somehow. Then if I lose internet connectivity the local DB will continue to store the data, and when connectivity is back up and running the DB will send everything it has cached to my server.
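For what it's worth, the local-buffer idea can be sketched as a simple "store-and-forward" queue: always write each scan to a local SQLite file first, then flush any unsent rows whenever the upload succeeds. This is only a sketch under assumptions - the table layout, the `upload()` callback, and the file name are all hypothetical, not part of the existing setup:

```python
# Store-and-forward sketch: buffer scan results in a local SQLite file,
# then flush anything unsent once connectivity returns.
# The table layout, upload() callback, and DB_PATH are assumptions.
import json
import sqlite3
import time

DB_PATH = "scan_buffer.db"  # local buffer on the Pi's SD card (hypothetical name)

def open_buffer(path=DB_PATH):
    conn = sqlite3.connect(path)
    conn.execute("""CREATE TABLE IF NOT EXISTS scans (
                        id INTEGER PRIMARY KEY AUTOINCREMENT,
                        mac TEXT, rssi INTEGER, ts REAL,
                        sent INTEGER DEFAULT 0)""")
    return conn

def record_scan(conn, mac, rssi, ts=None):
    """Always write locally first, regardless of connectivity."""
    conn.execute("INSERT INTO scans (mac, rssi, ts) VALUES (?, ?, ?)",
                 (mac, rssi, ts or time.time()))
    conn.commit()

def flush(conn, upload):
    """Try to push every unsent row to the server; stop at the first
    failure so ordering is preserved and the rest is retried later."""
    rows = conn.execute(
        "SELECT id, mac, rssi, ts FROM scans WHERE sent = 0 ORDER BY id"
    ).fetchall()
    for row_id, mac, rssi, ts in rows:
        payload = json.dumps({"mac": mac, "rssi": rssi, "ts": ts})
        if not upload(payload):  # upload() returns True on success
            break
        conn.execute("UPDATE scans SET sent = 1 WHERE id = ?", (row_id,))
        conn.commit()
```

Usage would be: call `record_scan()` for every scan result and `flush()` periodically (or right after each insert). While the link is down `upload()` fails and rows simply accumulate; when it comes back they drain to the server in order, which gives the near-real-time behaviour of the current setup when online.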
However, reading several threads, a lot of people are saying that it's not a good idea to store the data in SQL... I'm not sure why. It also doesn't seem that easy to get tshark to save its output to SQL without scripting, or some people have said to pipe it directly.
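The "pipe it directly" suggestion usually means tshark never talks to SQL itself: it prints one line per frame in fields mode, and a small script reads that stream and does whatever storage it likes. A minimal sketch, assuming a monitor-mode interface named `wlan0mon` and the `frame.time_epoch` / `wlan.sa` / `radiotap.dbm_antsignal` fields (adjust for your capture setup):

```python
# Sketch of piping tshark's fields output into Python instead of
# writing to SQL directly. Interface name and field list are assumptions.
# Each output line is tab-separated: <epoch time>\t<source MAC>\t<RSSI dBm>
import subprocess

TSHARK_CMD = [
    "tshark", "-i", "wlan0mon", "-l",  # -l: line-buffered output for piping
    "-T", "fields",
    "-e", "frame.time_epoch",
    "-e", "wlan.sa",
    "-e", "radiotap.dbm_antsignal",
]

def parse_line(line):
    """Turn one tshark fields line into (timestamp, mac, rssi),
    or None if any field is missing."""
    parts = line.rstrip("\n").split("\t")
    if len(parts) != 3 or not all(parts):
        return None
    ts, mac, rssi = parts
    # radiotap may report several antennas comma-separated; take the first
    return float(ts), mac, int(rssi.split(",")[0])

def capture(handle):
    """Run tshark and call handle(ts, mac, rssi) for every parsed frame."""
    proc = subprocess.Popen(TSHARK_CMD, stdout=subprocess.PIPE, text=True)
    for line in proc.stdout:
        parsed = parse_line(line)
        if parsed:
            handle(*parsed)
```

The `handle` callback is where the developer would plug in the local buffering: inserting into the local DB and attempting the upload, rather than trying to make tshark itself write to SQL.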
What I would like to ask the community for is ideas on the best way to achieve my goal: a locally stored copy of the data that can replicate to a main server, but which, when my RPi is online, does it all in as close to real time as possible, as per the current setup.
Thanks in advance.
asked 27 Dec '16, 03:04