I want to perform geotemporal analysis on the movements of a fleet of vehicles. I am new to graph databases, but I have watched the webinars on temporal and geospatial analysis, and if I understand correctly, I would need a vertex for each (geogrid, time) point that a vehicle passes through. At a million points per hour, that is about 9 billion vertices per year of data, with at least three times as many edges.

Is that correct? If so, what would that translate to in memory and disk requirements for TigerGraph? Most queries would be on the current day’s data, but I would need to be able to pull up a particular vehicle for a given date and time up to a year back. How much data would be held in memory vs. stored on disk?

Again, I’m totally new to graph databases, and at this point I’m trying to figure out whether this is feasible based on hardware requirements.
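For reference, the arithmetic behind my vertex and edge counts (assuming a steady million points per hour all year):

```python
# Back-of-envelope check of the counts above.
points_per_hour = 1_000_000
hours_per_year = 24 * 365  # 8,760 hours

vertices_per_year = points_per_hour * hours_per_year
edges_per_year = 3 * vertices_per_year  # "at least three times as many edges"

print(f"vertices/year: {vertices_per_year:,}")    # 8,760,000,000, i.e. ~9 billion
print(f"edges/year (min): {edges_per_year:,}")    # 26,280,000,000
```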
Welcome to the TigerGraph Community!
Your problem is definitely what we call a “graph-shaped problem”, so it would suit TigerGraph’s capabilities perfectly. Its capacity as well: we have database instances handling more data than you are predicting.
Without knowing more about your data, it’s difficult to estimate compute and storage capacity requirements. We typically like to take a sample data set, create a provisional schema, load the data, and look at the stats, so that we can provide an acceptably accurate estimate.
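Until you have real load stats, a naive sizing sketch can at least frame the scale. The bytes-per-vertex and bytes-per-edge figures below are placeholder assumptions for illustration only, not TigerGraph-specific numbers; you would replace them with the per-type storage stats you observe after loading your sample data:

```python
# Rough-sizing sketch with ASSUMED per-element costs (not TigerGraph figures).
def estimate_storage_gb(n_vertices, n_edges,
                        bytes_per_vertex=100, bytes_per_edge=50):
    """Naive raw-data estimate in GB; ignores compression, indexes, replicas."""
    total_bytes = n_vertices * bytes_per_vertex + n_edges * bytes_per_edge
    return total_bytes / 1e9

# One year at ~8.76B vertices and ~26.28B edges:
print(f"~{estimate_storage_gb(8.76e9, 26.28e9):,.0f} GB raw")  # ~2,190 GB raw
```

The point of the exercise is the order of magnitude (low single-digit TB of raw data per year under these assumptions), which is what drives the memory-vs-disk and cluster-sizing conversation.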