NOTE: Only hiring agencies that can provide long-term maintenance if the tool breaks. Don’t apply if you’re an individual.
Goal of this project: to build a database that can handle millions of rows, with import and fast filtering functions.
We are looking to build a large database of websites with more than 1 million rows and 3 to 5 columns. Currently we are using Excel, but it slows down considerably even on a decently specced system.
**Checklist of features to be added in the tool.**
– [ ] Ability to add new columns and rows
– [ ] Automatic removal of duplicates from the column(s) of choice (important)
– [ ] Must load fast, without noticeable lag
– [ ] Ability to import new data via CSV
– [ ] Ability to filter the database with a filter function
– [ ] Ability to search for a particular domain in the list and replace a value via CSV import (needed when we want to update the email ID of a certain site)
– [ ] Ability to parse the whole database to match and import the email IDs for a specific domain URL via CSV import
– [ ] Ability to delete particular rows/domains
**NOTE: We won’t need to load all of the rows at once. The main things that need to be done are removing duplicates and keeping the database clean and up to date.**
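To make the CSV-import and dedupe requirements concrete, here is a minimal sketch using SQLite (which comfortably handles millions of rows on disk). The table and column names below are illustrative assumptions, not a spec:

```python
import csv
import sqlite3

# assumed schema; an on-disk file would replace ":memory:" in production
conn = sqlite3.connect(":memory:")
conn.execute("""CREATE TABLE sites (
    website_url TEXT, traffic INTEGER, email_id TEXT,
    category TEXT, last_modified TEXT)""")

def import_csv(path):
    # bulk-load rows from a CSV whose header matches the assumed columns
    with open(path, newline="") as f:
        rows = [(r["website_url"], r["traffic"], r["email_id"],
                 r["category"], r["last_modified"]) for r in csv.DictReader(f)]
    conn.executemany("INSERT INTO sites VALUES (?, ?, ?, ?, ?)", rows)

def dedupe(column):
    # keep the first-inserted row for each distinct value in the chosen column
    conn.execute(f"""DELETE FROM sites WHERE rowid NOT IN
        (SELECT MIN(rowid) FROM sites GROUP BY {column})""")
```

Because SQLite queries only the rows a filter matches, there is no need to load the whole dataset into memory at once, which addresses the note above.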
### These are the columns we would need (there should be an option to add rows and columns):
**Website URL | Traffic | Email ID | Category | Last Modified** (*Last Modified is the date on which the row was last updated*)
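The email-update workflow from the checklist (match a domain from an uploaded CSV and overwrite its Email ID) could look roughly like the sketch below. The two-column table and the CSV header names are assumptions for illustration only:

```python
import csv
import sqlite3

# minimal stand-in table: just the domain and email columns are needed here
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sites (website_url TEXT, email_id TEXT)")
conn.executemany("INSERT INTO sites VALUES (?, ?)",
                 [("example.com", ""), ("other.org", "old@other.org")])

def update_emails_from_csv(path):
    # each CSV row pairs a domain with its new email ID; rows whose domain
    # matches an existing website_url are updated in place
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            conn.execute("UPDATE sites SET email_id = ? WHERE website_url = ?",
                         (row["email_id"], row["website_url"]))
```

Domains in the CSV that are not present in the table are simply ignored; whether they should instead be inserted as new rows is a design decision for the client.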
Hourly Range: $20.00-$60.00