
# IAA User Guide

After installing the IAA in your Snowflake account, it's time to analyze your workload.

When you open the application, you'll land on the IAA home page.


## Explore my executions

### How to upload your SMA output

You can get the SMA at the following link. If you need more information on how to run the SMA, find the details in its documentation.

1. Run an assessment on your code.
2. Click **View Results**.
3. Navigate to the output folder and locate the zip file `AssessmentFiles_*.zip`.
4. Go to your Snowflake account (the one used for the deployment) and navigate to the `SMA_EXECUTIONS` stage by clicking:

   **Data > Databases > [Name of the deployment database] > Stages > SMA_EXECUTIONS**
5. Upload your `AssessmentFiles_*.zip` into the stage.
6. Navigate to your Interactive Assessment Application.
7. Click **Reload Interactive Assessment Application**.
8. The data will take about 30 seconds to load; you can reload the page while you wait.
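As an alternative to uploading through the UI, the same steps can be scripted. The sketch below is a minimal Python example: the stage name `SMA_EXECUTIONS` comes from the steps above, but the function names and everything else are illustrative assumptions. It locates the newest `AssessmentFiles_*.zip` in the SMA output folder and builds the SQL `PUT` command you would execute over a Snowflake connection:

```python
import glob
import os
from pathlib import Path


def latest_assessment_zip(output_dir: str) -> str:
    """Return the most recently modified AssessmentFiles_*.zip in the SMA output folder."""
    candidates = glob.glob(os.path.join(output_dir, "AssessmentFiles_*.zip"))
    if not candidates:
        raise FileNotFoundError(f"No AssessmentFiles_*.zip found in {output_dir}")
    return max(candidates, key=os.path.getmtime)


def build_put_command(zip_path: str, stage: str = "SMA_EXECUTIONS") -> str:
    """Build the SQL PUT command that uploads the zip into the named stage.

    AUTO_COMPRESS=FALSE keeps the .zip as-is instead of gzipping it again;
    OVERWRITE=TRUE lets you re-upload a corrected file under the same name.
    """
    posix_path = Path(zip_path).resolve().as_posix()
    return f"PUT 'file://{posix_path}' @{stage} AUTO_COMPRESS=FALSE OVERWRITE=TRUE"
```

You would pass the returned string to an open cursor (for example, via `snowflake-connector-python`) against the deployment database before reloading the app.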

## Explore your execution

In the left-side menu you will find the different aspects of the execution to explore.

- **Execution**: A brief summary of the selected executions, with a few key metrics such as the Spark API, Third-party API, and SQL readiness scores. Here you can select one or multiple executions.
- **Inventories**: This section provides exportable raw files (xlsx format) on topics ranging from notebooks to I/O file operations.
- **Assessment Report**: Download the SMA report.
- **Code Tree Map**: Shows the distribution of files in a graphical tree map. The size of each square represents the size of the corresponding code file.
- **Readiness by File**: Shows the breakdown by file and each file's readiness for migration.
- **Reader Writers**: Allows you to understand which files in your project are reading or writing data.
- **Dependencies**: This section provides information about your project dependencies, including both internal and external ones.

## Explore the compatibility between Spark and Snowpark

This section of the app provides insights into the current capabilities of the SMA for different kinds of APIs. This is not specific to your execution.


### API Module Mappings

This gives you information about the current support for libraries, including programming-language built-ins and third-party libraries. In the Category dropdown you can filter by the kind of library.


### Spark, PySpark and Pandas API Mappings

The Spark libraries refer specifically to those for Scala/Java and their equivalents in Snowpark. The PySpark libraries are exclusive to Python.


### How to read this information

In the image below we see four different columns.

- **Category**: The name of the group the mapping belongs to.
- **Spark/PySpark Fully Qualified Name**: The full name of the function, class, or method in the Spark API.
- **Snowpark Fully Qualified Name**: The equivalent function, class, or method in Snowpark.
- **Mapping Status**: How the SMA will treat each Spark/Pandas element. For more details on the meaning of each status, please check the SMA documentation.
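If you export this table for downstream processing, the four columns map naturally onto a small record type. The sketch below is hypothetical: the field names come from the column list above, but the example row and the `Direct` status value are illustrative assumptions, not guaranteed to match the real table.

```python
from dataclasses import dataclass


@dataclass
class ApiMapping:
    """One row of the Spark/Snowpark API mapping table described above."""
    category: str
    spark_fully_qualified_name: str
    snowpark_fully_qualified_name: str
    mapping_status: str


# Illustrative row only -- check the actual table in the app for real values.
row = ApiMapping(
    category="DataFrame",
    spark_fully_qualified_name="pyspark.sql.DataFrame.filter",
    snowpark_fully_qualified_name="snowflake.snowpark.DataFrame.filter",
    mapping_status="Direct",
)
```

Grouping exported rows by `mapping_status` is a quick way to estimate how much of your workload will need manual attention.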