Adding to post-release epic as it's probably not a priority right now.
We need to determine how we want to handle large datasets, say anything over 2 GB (the Airline dataset, for example, is 81 GB). Relevant discussion started here. This would affect our CI in dax-schemata (related issue: CODAIT/dax-schemata#9).
Investigate Pandas and other packages for their ability to exchange and process data on disk rather than entirely in memory; see the sketch below.
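As a starting point for that investigation, here's a minimal sketch of chunked CSV reading with pandas, which keeps only one chunk in memory at a time. The file path, chunk size, and per-chunk work are hypothetical placeholders, not anything from our schemata:

```python
# Minimal sketch: stream a large CSV with pandas instead of loading it
# into memory all at once.
import pandas as pd

CSV_PATH = "airline.csv"  # hypothetical local path to a large dataset
CHUNK_ROWS = 1_000_000    # rows per chunk; tune to available memory

# read_csv with chunksize returns an iterator of DataFrames, so only one
# chunk is resident in memory at a time.
total_rows = 0
for chunk in pd.read_csv(CSV_PATH, chunksize=CHUNK_ROWS):
    total_rows += len(chunk)  # stand-in for real per-chunk processing

print(f"processed {total_rows} rows")
```

Beyond pandas, Dask and Vaex are worth comparing too, since they expose pandas-like APIs over lazy/out-of-core data and wouldn't require users to manage chunking themselves.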
edwardleardi added the core label (Enhancements or bugs related to the core without which PyDAX can't perform its minimal functionality) on Dec 17, 2020