[+] "Text File Input Filters" have been added to enable processing of CSV/text files of any size, with any number of rows, regardless of the amount of RAM in the computer. Filters can use regex patterns, the common relational comparisons, ranges and fuzzy matching.
https://citadel5.com/help/gsbase/open_text_file.png
For example, in GS-Base you can load this (publicly available) NYC Yellow Taxi data set: https://www.kaggle.com/datasets/elemento/nyc-yellow-taxi-tri...
(1.9GB, ~12.7 million rows) in ~20s on an average/older PC with 8-16GB RAM. The resulting sheet is loaded entirely into RAM and ready for instant ETL actions, column calculations, pivot table calculations, aggregations etc. With those 12.7 million rows, creating a pivot table with "trip_distance" as row values and "passenger_count" as column values takes around 45s on the same machine.
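As a rough sketch of what such a pivot computes, here is a pandas stand-in (not GS-Base's engine; the count aggregation and the "fare_amount" value column are assumptions, only the row/column fields come from the example above):

```python
import pandas as pd

def trip_pivot(df):
    """Pivot with trip_distance as rows and passenger_count as columns.

    Counts trips per (distance, passenger_count) cell; "fare_amount"
    is just a non-null column to count and is an assumed name.
    """
    return pd.pivot_table(
        df,
        index="trip_distance",
        columns="passenger_count",
        values="fare_amount",
        aggfunc="count",
    )
```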
With input filtering you can process CSV files of tens of gigabytes and larger, with no "cloud" limitations, even on old desktop PCs.
[+] Create tables with file listings from folders and entire disks. Scan and monitor file changes on your disks: find changed, new and deleted files. Add and maintain any type of file metadata for each file; search the automatically created history of changes for each file; search names/paths with regex; search for duplicates; perform fuzzy searches. Filter by full path, folder, file name, modification date, access date and file size. Examples:
Creating and verifying lists of files on disks: https://citadel5.com/help/gsbase/ver_files.htm
Filtering collections of photos by EXIF tags, searching for jpg/mp3/mp4 file duplicates: https://citadel5.com/help/gsbase/formulas_duplicate_files.ht...