There are several approaches to performing the transfer efficiently while preserving performance and data integrity. The right method depends on the architecture and volume of the data involved, so the main options and their trade-offs are listed below.
1. Using Power BI APIs & DOMO APIs
Extract via the Power BI REST API: The Export To File or Datasets Execute Queries endpoints can be used to extract report or dataset data.
Import into DOMO: Use the DOMO DataSet API to push the extracted data into a DOMO dataset.
Automate: You can automate the transfer with either a Power Automate flow or a Python script (using the requests library or the pydomo SDK); a minimal sketch follows this option.
Pros: Flexible, automated, and scalable.
Challenges: API development and authentication handling.
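To make the API route concrete, here is a minimal sketch of a Python transfer script. It assumes an Azure AD access token with Power BI API permissions, a DOMO OAuth token with the data scope, and an existing DOMO dataset whose column order matches the query output; the dataset IDs and the DAX query are placeholders.

```python
# Minimal sketch: pull rows from Power BI (Execute Queries) and push them
# into an existing DOMO dataset (DataSet API). Tokens, IDs, and the DAX
# query below are placeholders.
import csv
import io

import requests

PBI_TOKEN = "<azure-ad-access-token>"      # e.g. obtained via MSAL client credentials
PBI_DATASET_ID = "<power-bi-dataset-id>"
DOMO_TOKEN = "<domo-oauth-token>"          # from https://api.domo.com/oauth/token
DOMO_DATASET_ID = "<domo-dataset-id>"

# 1. Extract rows from the Power BI dataset with a DAX query.
resp = requests.post(
    f"https://api.powerbi.com/v1.0/myorg/datasets/{PBI_DATASET_ID}/executeQueries",
    headers={"Authorization": f"Bearer {PBI_TOKEN}"},
    json={"queries": [{"query": "EVALUATE TOPN(100000, 'Sales')"}]},
    timeout=120,
)
resp.raise_for_status()
rows = resp.json()["results"][0]["tables"][0]["rows"]  # list of dicts

# 2. Serialize to CSV without a header row; column order must match the
#    DOMO dataset schema.
buf = io.StringIO()
writer = csv.DictWriter(buf, fieldnames=list(rows[0].keys()))
writer.writerows(rows)

# 3. Replace the contents of the DOMO dataset via the DataSet API.
resp = requests.put(
    f"https://api.domo.com/v1/datasets/{DOMO_DATASET_ID}/data",
    headers={"Authorization": f"Bearer {DOMO_TOKEN}", "Content-Type": "text/csv"},
    data=buf.getvalue(),
    timeout=120,
)
resp.raise_for_status()
print("Transfer complete:", resp.status_code)
```

Run this from a scheduler (or trigger it from Power Automate) to keep the DOMO dataset refreshed.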
2. Export Data and Use DOMO Workbench
Export from Power BI to CSV/Excel: You can export data from Power BI and upload the resulting files to DOMO with Workbench (a scripted alternative to the upload step is sketched after this option).
Automate with Power Automate: Schedule automatic exports from Power BI and push them to a DOMO data source.
Pros: Simple, no coding on your part.
Challenges: Introduces manual steps if not automated and may run into file size limits.
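If you later want to script only the upload step while keeping the simple CSV export, a minimal sketch with the pydomo SDK could look like this; the credentials, dataset ID, and file name are placeholders, and the target DOMO dataset is assumed to already exist with a schema matching the exported file.

```python
# Minimal sketch: upload a CSV exported from Power BI into an existing
# DOMO dataset using the pydomo SDK. Credentials and IDs are placeholders.
from pydomo import Domo

domo = Domo("<client-id>", "<client-secret>", api_host="api.domo.com")

# Import the file into the dataset (replacing its current contents).
domo.datasets.data_import_from_file(
    "<domo-dataset-id>",
    "sales_export.csv",   # the file exported from Power BI
)
```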
3. Use a Cloud Data Warehouse as an Intermediary
Export from Power BI to Azure SQL, Google BigQuery, or Snowflake (ideally one that is already connected to Power BI); a staging sketch follows this option.
Connect DOMO to the cloud warehouse via its native connectors.
Pros: Very good for large datasets, maintains data integrity and real-time sync.
Challenges: Requires cloud warehouse setup and incurs extra costs.
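To illustrate the staging step, the sketch below loads an exported Power BI file into an Azure SQL table with pandas and SQLAlchemy; DOMO then pulls that table on a schedule through its native Azure SQL connector. The connection string, table name, and file name are placeholders, and the pyodbc driver is assumed to be installed.

```python
# Minimal sketch: stage Power BI export data in Azure SQL so DOMO's native
# connector can pick it up. Connection details are placeholders.
import pandas as pd
from sqlalchemy import create_engine

# Data exported from Power BI (or built from the Execute Queries response).
df = pd.read_csv("sales_export.csv")

engine = create_engine(
    "mssql+pyodbc://<user>:<password>@<server>.database.windows.net/<database>"
    "?driver=ODBC+Driver+18+for+SQL+Server"
)

# Overwrite the staging table; DOMO's scheduled connector job reads from it.
df.to_sql("powerbi_sales_stage", engine, if_exists="replace", index=False)
```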
4. Third-party ETL Tools: Best for Large Transfers
Tools such as Fivetran, Matillion, or StitchData can efficiently transfer Power BI datasets to DOMO.
They allow incremental updates so that existing data is not re-loaded (the underlying pattern is sketched after this option).
Pros: Fully automated, reliable, and supports scheduled refreshes.
Challenges: Licensing costs with ETL tools.
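The ETL tools manage incremental loading for you, but if you ever need to reproduce the behavior in a custom script, the underlying idea is a high-water-mark pattern like the generic sketch below; the state file, column name, and callback functions are illustrative and not any specific tool's API.

```python
# Generic sketch of the incremental (high-water-mark) load pattern: only rows
# newer than the last successfully loaded timestamp are extracted and appended.
import json
from pathlib import Path

STATE_FILE = Path("last_sync.json")

def load_high_water_mark() -> str:
    """Return the timestamp of the last successful load (epoch start if none)."""
    if STATE_FILE.exists():
        return json.loads(STATE_FILE.read_text())["last_loaded_at"]
    return "1970-01-01T00:00:00Z"

def save_high_water_mark(timestamp: str) -> None:
    STATE_FILE.write_text(json.dumps({"last_loaded_at": timestamp}))

def incremental_sync(extract_since, load_rows) -> int:
    """Extract only rows newer than the mark and append them to the destination."""
    mark = load_high_water_mark()
    rows = extract_since(mark)          # e.g. a DAX/SQL filter: ModifiedAt > mark
    if rows:
        load_rows(rows)                 # e.g. an APPEND import into DOMO
        save_high_water_mark(max(row["ModifiedAt"] for row in rows))
    return len(rows)
```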