Please be patient with me - I understand a decent amount about SQL queries (I've passed 20761) but not a huge amount about managing/developing SQL.
I have a regular set of 7 tables from a supplier, refreshed every midnight, that I access via FTP. For the last few years, whenever I wanted to refresh our local dataset, I've had to download the data from FTP, convert it to Excel, and then use the Import/Export Wizard to update the SQL tables (now hosted on Azure).
A couple of these tables are becoming quite bloated now - after conversion, one of the files is 27MB, which takes ages to upload to Azure over my employer's VPN.
So my question is: is there a way to automate this process, so Azure knows to grab the data from the FTP server every 24 hours and update the database automatically? I don't even mind going through the process of identifying the datatypes on every CSV the one time, to make sure it's a quick process in the future (the quickest part of using Excel is that the SQL tables pick up the existing format of the data).
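For context, here's roughly what I mean by the one-time datatype step - a minimal sketch in Python that samples each CSV and suggests a SQL Server type per column. The function names are mine, not part of any tool, and the type rules are deliberately simplistic:

```python
import csv
import io


def _is_int(v):
    try:
        int(v)
        return True
    except ValueError:
        return False


def _is_float(v):
    try:
        float(v)
        return True
    except ValueError:
        return False


def infer_sql_type(values):
    """Guess a reasonable SQL Server column type from sample string values."""
    non_empty = [v for v in values if v != ""]
    if not non_empty:
        return "NVARCHAR(255)"  # no data to judge from; fall back to text
    if all(_is_int(v) for v in non_empty):
        return "INT"
    if all(_is_float(v) for v in non_empty):
        return "FLOAT"
    # otherwise treat as text, sized to the longest sample value
    return f"NVARCHAR({max(len(v) for v in non_empty)})"


def suggest_column_types(csv_text, sample_rows=1000):
    """Read the header plus a sample of rows and suggest a type per column."""
    reader = csv.reader(io.StringIO(csv_text))
    header = next(reader)
    samples = {name: [] for name in header}
    for i, row in enumerate(reader):
        if i >= sample_rows:
            break
        for name, value in zip(header, row):
            samples[name].append(value)
    return {name: infer_sql_type(vals) for name, vals in samples.items()}
```

Running it over a small sample like `"id,price,name\n1,9.99,Widget\n2,12.50,Gadget\n"` suggests `INT` for `id`, `FLOAT` for `price`, and an `NVARCHAR` for `name`. Real dates, decimals, and NULL conventions would need more care, but something along these lines would give me a starting `CREATE TABLE` definition per file.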
I've heard I can set something up using Visual Studio (after the files have been converted to xlsx), but I know absolutely nothing about VS and I honestly don't have the time to learn it at the moment.
Thank you in advance for any help!