Plotting large data volumes: how to improve rendering speed
Step 1: Data filtering

Method 1: Build a MAP (hash map) of the points to filter out duplicates, so each distinct point is stored and drawn only once.
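
A minimal C++ sketch of that idea, assuming points arrive as (x, y) doubles; the `Point`, `PointKey`, and `dedupe` names are illustrative, not from the original:

```cpp
#include <cstdint>
#include <cstring>
#include <functional>
#include <unordered_set>
#include <vector>

struct Point { double x, y; };

// Exact key: the raw bit patterns of both coordinates. Equality means the
// doubles are bit-identical, so no genuinely distinct point is ever dropped.
struct PointKey {
    uint64_t bx, by;
    explicit PointKey(const Point& p) {
        std::memcpy(&bx, &p.x, sizeof bx);
        std::memcpy(&by, &p.y, sizeof by);
    }
    bool operator==(const PointKey& o) const { return bx == o.bx && by == o.by; }
};
struct PointKeyHash {
    size_t operator()(const PointKey& k) const {
        return std::hash<uint64_t>()(k.bx) ^ (std::hash<uint64_t>()(k.by) << 1);
    }
};

// Keep only the first occurrence of each distinct point.
std::vector<Point> dedupe(const std::vector<Point>& raw) {
    std::unordered_set<PointKey, PointKeyHash> seen;
    std::vector<Point> out;
    for (const Point& p : raw)
        if (seen.insert(PointKey(p)).second)  // .second is true for a new key
            out.push_back(p);
    return out;
}
```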

Method 2: Define your own filtering rules (e.g., ignore everything past the 5th decimal place, so a point that matches an existing one after rounding is not added as a new point). See the sketch below.
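
One plausible reading of that rule: quantize each coordinate to 5 decimal places and let points that collide after quantization count as one. A sketch under that assumption (`quantize` and `filterByRule` are hypothetical helpers):

```cpp
#include <cmath>
#include <cstdint>
#include <set>
#include <utility>
#include <vector>

struct Point { double x, y; };

// Quantize one coordinate to 5 decimal places: digits past the 5th
// no longer distinguish points, per the rule above.
inline int64_t quantize(double v) {
    return std::llround(v * 1e5);
}

// Keep the first point in each quantization cell; later points that
// land in an already-seen cell are not added as new points.
std::vector<Point> filterByRule(const std::vector<Point>& raw) {
    std::set<std::pair<int64_t, int64_t>> seen;
    std::vector<Point> out;
    for (const Point& p : raw)
        if (seen.insert({quantize(p.x), quantize(p.y)}).second)
            out.push_back(p);
    return out;
}
```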

Step 2: Streamlining the display algorithm

Generally you use double buffering ("double caching"), but you have to really understand it: once the initial rendering has been saved into an off-screen DC, do not re-execute all the drawing steps on every repaint; reuse the saved result instead.
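
A minimal Win32 GDI sketch of that pattern (the original likely uses MFC's CDC, but the idea is identical; window and size handling are simplified, and `BuildCache`/`OnPaint` are illustrative names):

```cpp
#include <windows.h>

static HDC     g_memDC  = nullptr;  // off-screen DC holding the cached image
static HBITMAP g_bitmap = nullptr;
static int     g_width  = 0, g_height = 0;

// Run once (and again only when the data, zoom, or window size changes):
// do the slow point rendering into a memory DC and keep the result.
void BuildCache(HWND hwnd, int width, int height) {
    HDC screen = GetDC(hwnd);
    g_memDC  = CreateCompatibleDC(screen);
    g_bitmap = CreateCompatibleBitmap(screen, width, height);
    SelectObject(g_memDC, g_bitmap);
    ReleaseDC(hwnd, screen);
    g_width  = width;
    g_height = height;

    // ... draw all the filtered points into g_memDC here (slow, done once) ...
}

// WM_PAINT handler: no point is redrawn; the cached bitmap is simply
// blitted to the screen, which is what keeps interaction smooth.
void OnPaint(HWND hwnd) {
    PAINTSTRUCT ps;
    HDC dc = BeginPaint(hwnd, &ps);
    BitBlt(dc, 0, 0, g_width, g_height, g_memDC, 0, 0, SRCCOPY);
    EndPaint(hwnd, &ps);
}
```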

I once streamlined 3 million rows of data down to 110,000 points, of which only 7,000 to 8,000 had to be rendered. Apart from the first pass, where filtering the data and rendering it is relatively slow, all subsequent browsing, zooming in, zooming out, panning, and dragging were very smooth.