If your Power BI (PBIX) file has grown large because of the amount of data it contains, consider the following suggestions:
Optimizing the Data Model: Remove unnecessary columns and rows, and use the smallest appropriate data types (for instance, Date instead of DateTime). Where possible, summarize the data, for example storing monthly rather than daily figures, to reduce the row count.
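The summarization idea can be sketched in Power Query (M). This is a minimal, hypothetical example assuming a query named DailySales with Date and Amount columns; the names are illustrative, not from the original answer:

```m
let
    Source = DailySales,
    // Map each daily row to the first day of its month
    AddMonth = Table.AddColumn(Source, "Month", each Date.StartOfMonth([Date]), type date),
    // Collapse daily rows into one row per month
    Grouped = Table.Group(AddMonth, {"Month"},
        {{"TotalAmount", each List.Sum([Amount]), type number}})
in
    Grouped
```

After this step the model stores one row per month instead of one per day, which can shrink a large fact table dramatically.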
Optimizing DAX and Relationships: Limit calculated columns by performing the calculation at the source or in Power Query instead, prefer measures over calculated columns where appropriate, and use bidirectional cross-filtering only when it is genuinely needed.
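Moving a row-level calculation from a DAX calculated column into Power Query might look like the following sketch, assuming a hypothetical Sales query with Quantity and UnitPrice columns:

```m
let
    Source = Sales,
    // Computed at load time, so it compresses like any other imported column
    AddLineTotal = Table.AddColumn(Source, "LineTotal",
        each [Quantity] * [UnitPrice], type number)
in
    AddLineTotal
```

Columns created this way are evaluated once during refresh rather than materialized by the DAX engine, and anything purely aggregate (totals, averages) is usually better expressed as a measure instead of a stored column.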
Power Query Modifications: Apply filters as early as possible in the query to reduce the amount of data loaded (for instance, only data from recent years). Disable load for queries that serve only as intermediate steps, and keep the number of transformation steps to a minimum.
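Filtering early can be sketched as follows; the server, database, and table names are placeholders, not real values from the original answer:

```m
let
    Source = Sql.Database("server", "db"),  // hypothetical connection
    Sales = Source{[Schema = "dbo", Item = "FactSales"]}[Data],
    // Filtering immediately after the source step lets the filter fold
    // back into the SQL query, so the excluded rows are never transferred
    RecentYears = Table.SelectRows(Sales,
        each Date.Year([OrderDate]) >= Date.Year(DateTime.LocalNow()) - 2)
in
    RecentYears
```

Keeping the filter adjacent to the source step preserves query folding against foldable sources such as SQL Server, which is what makes "filter early" cheaper than filtering after the data has been loaded.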
Make use of VertiPaq Compression: Avoid high-cardinality columns with mostly unique values, such as ID or GUID fields, because they compress poorly in the VertiPaq engine. Remove them entirely when they are not needed for relationships or reporting.
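Dropping such a column is a one-step Power Query change; this sketch assumes a hypothetical FactSales query with an unused TransactionGUID column:

```m
let
    Source = FactSales,
    // A unique identifier that no relationship or visual uses adds size
    // without adding value, so it is removed before load
    Trimmed = Table.RemoveColumns(Source, {"TransactionGUID"})
in
    Trimmed
```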
Incremental Refresh: For large datasets that continue to grow, configure incremental refresh so that each refresh loads only new or changed data rather than the entire table, reducing redundant work and improving refresh times.
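Incremental refresh in Power BI is driven by two datetime parameters that must be named RangeStart and RangeEnd; the query filters on them to define each partition. A minimal sketch, again assuming a FactSales query with an OrderDate column:

```m
let
    Source = FactSales,
    // Power BI substitutes RangeStart/RangeEnd per partition at refresh time;
    // use >= on one boundary and < on the other to avoid duplicated rows
    Partition = Table.SelectRows(Source,
        each [OrderDate] >= RangeStart and [OrderDate] < RangeEnd)
in
    Partition
```

With this filter in place, the incremental refresh policy is then configured on the table in Power BI Desktop (for example, archive five years, refresh the last ten days).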
Together, these changes will make your PBIX file smaller and more efficient, improving both load times and report performance.