Historically, to explore big data you needed to write statistical analysis scripts in R, build machine learning models in Python, or manually filter data in a spreadsheet. If you have the data science expertise and the time, manual analysis is still an option.
Today’s visualization software and BI tools make it easy to integrate disparate data sources and apply advanced techniques such as regression, univariate, bivariate, and multivariate analysis, and principal component analysis. These tools also let you monitor your data sources, collaborate with others, and share findings in interactive data dashboards. The best tools even integrate AutoML capabilities that simplify building custom machine learning models.
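To make two of the techniques above concrete, here is a minimal sketch of multivariate regression and principal component analysis in Python. It assumes scikit-learn and NumPy are available (an assumption; any statistics library, or R, would work just as well), and uses synthetic data for illustration.

```python
# Hypothetical illustration: multivariate regression and PCA with scikit-learn.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 3))  # 100 observations, 3 features (synthetic data)
y = 2.0 * X[:, 0] - 1.0 * X[:, 1] + rng.normal(scale=0.1, size=100)

# Multivariate regression: predict y from all three features at once.
model = LinearRegression().fit(X, y)
print("coefficients:", model.coef_)  # should be close to [2.0, -1.0, 0.0]

# Principal component analysis: project onto the top two components.
pca = PCA(n_components=2).fit(X)
print("explained variance ratio:", pca.explained_variance_ratio_)
```

A BI tool performs these same computations behind a point-and-click interface; the sketch just shows what is happening underneath.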
You can also consider open source tools such as Pentaho, R, KNIME, NodeXL, RapidMiner, and OpenRefine. Be aware that open source software can be tricky to set up and use, and it can expose you to security risks.
Associative vs Linear Exploration