Snowflake SnowPro Core Certification Complete Exam
Snowflake SnowPro Core Practice Exam with Solutions Part 2
In the first part, we saw the first 50 questions. In this chapter, we’ll see the next 50, like in the certification exam. Best of luck both here and at the Snowflake SnowPro Core Certification test!
Remember that you can find this exam FOR FREE at FullCertified. Take it now with our exam simulator!
In case you want to refresh any theoretical concept, you can find all the chapters of the course at FullCertified.
QUESTIONS WITH SOLUTIONS
51: Which commands cannot be executed from the Snowflake UI?
- SHOW
- LIST <stages>
- GET
- COPY INTO
- PUT
Solution: 3, 5. These two commands cannot be executed from the Snowflake web interface; instead, you should use the SnowSQL client to GET or PUT data files.
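As a reference, here is a minimal sketch of how these commands look when run from SnowSQL; the stage name and local paths are hypothetical:
PUT file:///tmp/data/mydata.csv @my_internal_stage;
GET @my_internal_stage/mydata.csv.gz file:///tmp/downloads/;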
52: Compute cost in Snowflake depends on…
- The query execution time.
- The query execution time and the waiting query time.
- The data warehouse size and for how long it runs.
Solution: 3. The cost does not depend on how many queries you run in the warehouse. It depends on which warehouse size and how long it runs.
53: What techniques would you consider to improve the performance of a query that takes a lot of time to return any result?
- Define partition keys
- Create cluster keys & turn auto clustering on the table
- Create an index on the search result
Solution: 2. Clustering keys are a subset of columns or expressions on a table designated to co-locate the data in the same micro-partitions. Data might become less clustered when we perform many DML operations on a table. To solve that, Snowflake provides periodic & automatic re-clustering to maintain optimal clustering. These techniques improve the performance of queries that take a long time, as Snowflake will analyze fewer micro-partitions.
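For illustration, this is roughly how you would define a clustering key (the table and column names are hypothetical); once a key is defined, Automatic Clustering maintains it in the background:
ALTER TABLE sales CLUSTER BY (region, order_date);
-- Automatic Clustering can be paused or resumed per table if needed
ALTER TABLE sales SUSPEND RECLUSTER;
ALTER TABLE sales RESUME RECLUSTER;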
54: If a transaction disconnects and goes into a detached state, which cannot be committed or rolled back, how long will Snowflake take to abort the transaction?
- 15 minutes
- 60 minutes
- 4 hours
- 12 hours
Solution: 3. If the transaction is left open or not aborted by the user, Snowflake automatically rolls back the transaction after being idle for four hours. You can still abort a running transaction with the system function: “SYSTEM$ABORT_TRANSACTION”
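For example, you could first locate the open transaction and then abort it; the transaction id below is just a placeholder:
SHOW TRANSACTIONS;
SELECT SYSTEM$ABORT_TRANSACTION(1611858982000000000);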
55: Which of the following services are managed by the Cloud Services Layer?
- Metadata Management
- Authentication
- Storage
- Infrastructure Management
- Query Execution
Solution: 1, 2, 4. The Cloud Services layer is a collection of services coordinating activities across Snowflake. It’s in charge of Authentication, Infrastructure management, Metadata management, Query parsing and optimization, and Access control.
56: What types of tables are available in Snowflake?
- Permanent
- Temporary
- Transient
- External
- Internal
Solution: 1, 2, 3, 4. You can see the differences between these tables in the following image:
57: Which Resource Monitor property lets you specify whether you want to control the credit usage of your entire account or a specific set of warehouses?
- Credit Quota
- Monitor Level
- Schedule
- Notification
Solution: 2. The monitor level is a property that specifies whether the resource monitor is used to monitor the credit usage for your entire account or individual warehouses.
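As a sketch (the monitor and warehouse names are hypothetical), the monitor level is determined by what you attach the monitor to, either the account or individual warehouses:
CREATE RESOURCE MONITOR monthly_quota WITH CREDIT_QUOTA = 100;
ALTER ACCOUNT SET RESOURCE_MONITOR = monthly_quota;           -- account level
ALTER WAREHOUSE etl_wh SET RESOURCE_MONITOR = monthly_quota;  -- warehouse level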
58: A medium (M) warehouse has auto-suspend configured after 15 minutes. You have noticed that all of the queries on this warehouse finish within a minute. What will you do to optimize compute costs?
- Delete the warehouse after a minute
- Reduce the auto-suspend time to 1 minute
- Use another data-warehouse
Solution: 2. Reducing the auto-suspend time to 1 minute means the warehouse suspends shortly after each query finishes, so you stop paying for idle compute time.
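For example, lowering the auto-suspend threshold is a single statement (the warehouse name is hypothetical); note that the value is expressed in seconds:
ALTER WAREHOUSE reporting_wh SET AUTO_SUSPEND = 60;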
59: Which command will you run to list all users and roles to which a role has been granted?
- SHOW GRANTS TO ROLE <ROLE>
- SHOW GRANTS OF ROLE <ROLE>
- SHOW GRANTS IN ROLE <ROLE>
Solution: 2. “SHOW GRANTS OF ROLE” will list the users, whereas “SHOW GRANTS TO ROLE” will list the privileges to which this role has access. Here you can see an example of running the command in my Snowflake account:
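As a quick illustration with a hypothetical role name:
SHOW GRANTS OF ROLE analyst;  -- users and roles this role has been granted to
SHOW GRANTS TO ROLE analyst;  -- privileges this role has access to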
60: Which service does Snowflake use to provide the Zero-Copy cloning functionality?
- SSD Cache of the Virtual Warehouses
- Cache
- Metadata from the service layer
Solution: 3. Zero-Copy cloning does NOT duplicate data; it duplicates the metadata of the micro-partitions. For this reason, Zero-Copy cloning doesn’t consume storage. When you modify some cloned data, it will consume storage because Snowflake has to recreate the micro-partitions. You can see this behavior in the following image:
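For illustration, cloning is a single metadata-only statement (the object names are hypothetical):
CREATE TABLE orders_clone CLONE orders;
CREATE DATABASE dev_db CLONE prod_db;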
61: Which cloud provider is not supported by Snowflake?
- AWS
- Azure
- IBM
- Google Cloud Platform
Solution: 3. A Snowflake account can only be hosted on Amazon Web Services, Google Cloud Platform, and Microsoft Azure for now.
62: Which statements are correct about micro-partitions in Snowflake?
- Contiguous units of storage
- Non-contiguous units of storage
- 50 and 500MB of compressed data
- 50 and 500MB of uncompressed data
- Organized in a columnar way
Solution: 1, 4, 5. This definition is a must, and we need to know it perfectly: “All data in Snowflake tables is automatically divided into micro-partitions, which are contiguous units of storage between 50 and 500MB of uncompressed data, organized in a columnar way”.
63: Does GET support downloading files from external stages?
- True
- False
Solution: 2. If you want to download data from external stages, you need to use the cloud provider's own tools; for example, download the files directly from Amazon S3.
64: What are the different caching mechanisms available in Snowflake?
- Metadata cache
- Query result cache
- Index cache
- Table cache
- Warehouse cache
Solution: 1, 2, 5. The Metadata cache is maintained in the Cloud Services layer and contains object information & statistics. For example, when you execute COUNT(*) on a table, the result comes from this cache; that’s why it returns the information quickly. The Query Result cache holds the results of every query executed in the past 24 hours. If you repeat a statement and the underlying data hasn’t changed, it will use this cache. The last one is the Warehouse cache, attached to the SSD of each warehouse. In this case, the information is lost when the warehouse is suspended.
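If you want to test warehouse performance without the Query Result cache getting in the way, you can disable it for your session with this session parameter; a minimal example:
ALTER SESSION SET USE_CACHED_RESULT = FALSE;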
65: Which Snowflake edition supports private communication between Snowflake and your other VPCs through AWS PrivateLink?
- Standard
- Enterprise
- Business Critical
Solution: 3. AWS PrivateLink is an AWS service for creating private VPC endpoints that allow direct, secure connectivity between your AWS VPCs and the Snowflake VPC without traversing the public Internet. This feature requires the Business Critical edition or higher. You can see the differences between the Snowflake editions in the following image:
66: What are the different stages available in Snowflake?
- User
- Table
- Named internal
- Named external
- Account
- Schema
Solution: 1, 2, 3, 4. Snowflake stages are a big topic in this exam. External stages reference data files stored outside Snowflake, whereas internal stages store data files within Snowflake. The User stage (represented with “@~”) is a personal stage for each user, whereas the Table stage is the one that each table has by default. You can see the different stages in the following image:
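As a reference, this is how you would list the files in each kind of stage (the table and stage names are hypothetical):
LIST @~;               -- user stage
LIST @%my_table;       -- table stage
LIST @my_named_stage;  -- named internal or external stage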
67: Is re-clustering in Snowflake only triggered if the table would benefit from the operation?
- True
- False
Solution: 1. DML operations (INSERT, UPDATE, DELETE, MERGE, COPY) can make the data in the table become less clustered. To solve that, Snowflake provides periodic & automatic re-clustering to maintain optimal clustering. It only reclusters a clustered table if it benefits from the operation.
68: When staging uncompressed files in a Snowflake stage, are the files automatically compressed using gzip unless compression is explicitly disabled?
- True
- False
Solution: 1. When you stage uncompressed files, Snowflake automatically compresses them using gzip, unless compression is explicitly disabled (for example, with AUTO_COMPRESS = FALSE in the PUT command). If you stage already-compressed files, Snowflake can automatically detect the compression method (gzip, bzip2, deflate, and raw_deflate), or you can explicitly specify the method that was used to compress the files.
69: In which of the below scenarios is SnowPipe recommended to load data?
- We have a small volume of frequent data.
- We have a huge volume of data generated as part of a batch schedule
- In both of the previous scenarios
Solution: 1. Snowpipe is a serverless service that enables loading data when the files are available in any (internal/external) stage. You use it when you have a small volume of frequent data, and you load it continuously (micro-batches).
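A minimal sketch of a pipe, assuming a hypothetical target table and an external stage with event notifications already configured:
CREATE PIPE load_events_pipe
  AUTO_INGEST = TRUE
  AS
  COPY INTO raw_events
  FROM @my_external_stage
  FILE_FORMAT = (TYPE = 'JSON');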
70: You have two types of named stages, one is an external stage, and the other one is an internal stage. Will external stages always require a cloud storage provider?
- True
- False
Solution: 1. External stages reference data files stored in a location outside of Snowflake. Amazon S3 buckets, Google Cloud Storage buckets, and Microsoft Azure containers are the currently supported cloud storage services. You can see an example of it in the following diagram:
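For example, an external stage pointing to an S3 bucket could look like this (the bucket and the storage integration are hypothetical and must already exist):
CREATE STAGE my_s3_stage
  URL = 's3://my-bucket/data/'
  STORAGE_INTEGRATION = my_s3_integration;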
71: What influences Snowflake pricing? (SELECT ONLY ONE)
- Amount of data queried from Snowflake.
- Amount of data scanned from Snowflake.
- Snowflake pricing is based on usage & storage.
Solution: 3. Snowflake’s pricing model is primarily based on two consumption-based metrics: compute and data storage, and you pay for them independently. The compute cost is based on the number of credits used to run queries or perform a service. The storage charge is based on the number of bytes stored per month and the cost of moving data.
72: What is the name of Snowflake’s Command Line Interface tool?
- SnowCLI
- SnowSQL
- SnowTerminal
- SnowCMD
Solution: 2. SnowSQL is the command line client for connecting to Snowflake to execute SQL queries and perform all DDL and DML operations, including loading data into and unloading data out of database tables.
73: After how many days does the COPY INTO load metadata expire?
- One day
- 14 days
- 64 days
- 180 days
Solution: 3. The information about the loaded files is stored in Snowflake metadata. It means that you cannot COPY the same file again in the next 64 days unless you specify it (with the “FORCE=True” option in the COPY command). You can see this behavior in the following image:
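As a sketch with hypothetical object names, reloading a file that was already loaded looks like this:
COPY INTO my_table FROM @my_stage FILES = ('data.csv') FORCE = TRUE;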
74: Select the term that is associated with the compute layer:
- Query optimization
- Query planning
- Query processing
Solution: 3. The Compute layer is also referred to as the Query Processing layer, so you may see either name in the exam.
75: Which are the two metadata columns for staged files?
- METADATA$FILENAME
- METADATA$FILEFORMAT
- METADATA$FILE_ROW_NUMBER
Solution: 1, 3. The METADATA$FILENAME column is the name of the staged data file that the current row belongs to. METADATA$FILE_ROW_NUMBER is the row number of each record within its staged data file. This is one way to query the stage metadata:
SELECT metadata$filename, metadata$file_row_number FROM @MY_STAGE;
76: A warehouse ran for 62 seconds, and it was suspended. After some time, it ran for another 20 seconds. For how many seconds will you be billed?
- 20 seconds
- 62 seconds
- 92 seconds
- 122 seconds
Solution: 4. You will be billed for 122 seconds (62 + 60 seconds) because warehouses are billed for a minimum of one minute. The price would be different if the warehouse wasn’t suspended before executing the second query.
For example, if we had run a single query that took 62 seconds, you would be billed for exactly those 62 seconds. If it had only run for 20 seconds, you would have been billed for 60 seconds, the one-minute minimum.
77: Which type of data incurs Snowflake storage cost?
- Data Stored in permanent tables.
- Data Stored in temporary tables.
- Cache results.
- Data retained for Fail-Safe & Time-Travel.
Solution: 1, 2, 4. It also includes data stored in Snowflake locations (i.e., user and table stages or internal named stages). Cache results do NOT incur Storage costs.
78: How will you store JSON data in Snowflake?
- Using a column of the JSON type
- Using a column of the VARCHAR type
- Using a column of the VARIANT type
- Using a column of the NULL type
Solution: 3. Semi-structured data is saved as Variant type in Snowflake tables, with a maximum limit size of 16MB, and it can be queried using JSON notation. You can store arrays, objects, etc.
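A minimal example, assuming a hypothetical table, of storing and querying JSON with the VARIANT type:
CREATE TABLE events (payload VARIANT);
INSERT INTO events SELECT PARSE_JSON('{"user": {"name": "Ana", "age": 30}}');
SELECT payload:user.name::STRING AS user_name FROM events;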
79: Are Snowflake caches automatically invalidated if the underlying data changes?
- True
- False
Solution: 1. If the data in the Storage Layer changes, the caches are automatically invalidated.
80: If you want a dedicated virtual warehouse, which is the lowest Snowflake edition you should opt for?
- Standard
- Enterprise
- Business Critical
- Virtual Private Snowflake
Solution: 1. In Snowflake, all the Virtual Warehouses are dedicated to the users. If you create a virtual warehouse, you will only be the one using it.
81: What are the usual data-loading steps in Snowflake?
- Source → Snowflake Stage → Snowflake table
- Source → Snowflake Table → Snowflake Stage
- Snowflake table → Source → Snowflake Stage
Solution: 1. A source system (like a web application) loads the data into a Snowflake Stage (for example, an external stage like Amazon S3). Then we can copy the data into a Snowflake table. You can see this behavior in the following image:
82: What are the two types of data consumer accounts available in Snowflake?
- Shared Account
- Reader Account
- Public Account
- Full Account
Solution: 2, 4. There are two types of data consumers. The first one is the Full Accounts, the consumers with existing Snowflake accounts. In this case, the consumer account pays for the queries they make. We also have the Reader Accounts, the consumers without Snowflake accounts. In this last case, the producer account pays all the compute credits that their warehouses use. You can see this behavior in the following diagram:
83: Select the true statements about Snowflake tasks:
- A task can execute a single SQL Statement
- A task can execute multiple SQL Statements
- A task can execute a call to a Stored Procedure
- A task can execute a function
Solution: 1, 3. Only one SQL statement is allowed to be executed through a task. If you need to execute multiple statements, build a procedure.
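For illustration (the warehouse, schedule, and procedure names are hypothetical), a task wrapping a stored procedure call looks like this; note that tasks are created in a suspended state:
CREATE TASK nightly_load
  WAREHOUSE = my_wh
  SCHEDULE = 'USING CRON 0 2 * * * UTC'
  AS
  CALL load_daily_sales();
ALTER TASK nightly_load RESUME;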
84: Which of these certifications are compliant with Snowflake?
- HIPAA
- PCI-DSS
- FedRAMP
Solution: 1, 2, 3. They won’t ask you in-depth questions about this topic in the exam, but it’s important to remember some of the most important ones. You can see other certifications at the following link: https://www.snowflake.com/snowflakes-security-compliance-reports.
85: Queries in Snowflake are getting queued in the warehouses and delaying the ETL processes of the company. What possible solutions can you think of, considering we have the Snowflake Enterprise edition?
- Resize the warehouse
- Use multi-cluster warehouse
- Set auto-resize parameter to TRUE
- Contact Snowflake support to increase the size of the warehouse
Solution: 1, 2. By resizing the warehouse, your company will scale up, reducing the time to execute big queries. Using multi-cluster warehouses, you will have more queries running simultaneously and higher concurrency, which is the definition of scaling out. You can see the differences between the different ways to scale in the following picture:
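As a sketch with a hypothetical warehouse, scaling up and scaling out are two separate settings (multi-cluster warehouses require the Enterprise edition or above):
ALTER WAREHOUSE etl_wh SET WAREHOUSE_SIZE = 'LARGE';                     -- scale up
ALTER WAREHOUSE etl_wh SET MIN_CLUSTER_COUNT = 1 MAX_CLUSTER_COUNT = 3;  -- scale out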
86: Which Snowflake edition (and above) allows up to 90 days of Time Travel?
- Standard
- Enterprise
- Business Critical
Solution: 2. By default, Time travel is enabled with a 1-day retention period. However, we can increase it to 90 days if we have (at least) the Snowflake Enterprise Edition. It requires additional storage, which will be reflected in your monthly storage charges.
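For example, with a hypothetical table, you could extend the retention period and then query a past state of the data:
ALTER TABLE sales SET DATA_RETENTION_TIME_IN_DAYS = 90;
SELECT * FROM sales AT (OFFSET => -3600);  -- the table as it was one hour ago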
87: You have two virtual warehouses in your Snowflake account. If one of them updates the data in the storage layer, when will the other one see it?
- Immediately
- After an average time of 5 seconds
- After the sync process
Solution: 1. All the warehouses of your account share the storage layer, so if the data is updated, all the warehouses will be able to see it. You can see this behavior in the following image:
88: Which properties of a Resource Monitor can be modified?
- Credit Quota
- Monitor Level
- Schedule
- Actions
Solution: 1, 2, 3, 4. The Credit Quota specifies the number of Snowflake credits allocated to the monitor for the specified frequency interval. The Monitor Level specifies whether the resource monitor is used to monitor the credit usage for your entire account or individual warehouses. The Schedule indicates when the monitor will start monitoring and when the credits will reset to 0. Each action specifies a threshold and the action to perform when the threshold is reached within the specified interval.
89: Which database objects can be shared using Snowflake Secure Data Sharing?
- Tables
- External tables
- Secure views
- Secure materialized views
- Secure UDFs
- Users
Solution: 1, 2, 3, 4, 5. Secure Data Sharing lets you share selected objects in a database in your account with other Snowflake accounts. You can share all the previous database objects.
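A minimal sketch of creating a share (the database, schema, table, and consumer account names are hypothetical):
CREATE SHARE sales_share;
GRANT USAGE ON DATABASE sales_db TO SHARE sales_share;
GRANT USAGE ON SCHEMA sales_db.public TO SHARE sales_share;
GRANT SELECT ON TABLE sales_db.public.orders TO SHARE sales_share;
ALTER SHARE sales_share ADD ACCOUNTS = consumer_account;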
90: Does Snowflake support Nested Transactions?
- True
- False
Solution: 2. Snowflake does not support nested transactions.
91: What actions can a Resource Monitor perform when it hits the limit?
- Notify & Suspend
- Notify & Suspend Immediately
- Notify
- Notify & Increase the limit
Solution: 1, 2, 3. Notify performs no action but sends an alert notification (email/web UI). Notify & Suspend sends a notification and suspends all assigned warehouses after all statements being executed by the warehouse(s) have been completed. Notify & Suspend Immediately sends a notification and suspends all assigned warehouses immediately.
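For illustration, the three actions map directly to the TRIGGERS clause of a resource monitor (the name, quota, and thresholds below are hypothetical):
CREATE RESOURCE MONITOR dev_limit
  WITH CREDIT_QUOTA = 100
  TRIGGERS ON 75 PERCENT DO NOTIFY
           ON 100 PERCENT DO SUSPEND
           ON 110 PERCENT DO SUSPEND_IMMEDIATE;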
92: Is shared data always immediately available to Consumer Accounts?
- True
- False
Solution: 1. New objects added to a share become immediately available to all consumers, providing real-time access to shared data, which is always up-to-date. Also, consumers don’t pay for storage, as the producer account already pays for it.
93: Which of the following options are true about roles in Snowflake?
- Snowflake users have a limit on the number of roles that they can assume
- Snowflake users can have one or more roles
- Only one role can be active for a particular session
- Privileges can be directly assigned to users
Solution: 2, 3. Each user can be assigned multiple roles (and vice versa) but can assume only one role at a time in a session. Privileges are assigned to roles; that’s why the last option is false.
94: Can you load data using the PUT command through worksheets in the Snowflake UI?
- True
- False
Solution: 2. We can use the PUT command to UPLOAD files from a local directory/folder on a client machine into INTERNAL STAGES. It does NOT work with external stages, and we cannot use it from the Snowflake Web UI.
95: Do UDFs support both SQL & JavaScript in Snowflake?
- True
- False
Solution: 1. User-defined functions (UDFs) let you extend the system to perform operations that are not available through Snowflake’s built-in, system-defined functions. They support SQL, JavaScript, Java, and Python (these last two are new features).
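As a minimal sketch of both languages (the function names are hypothetical); note that in JavaScript UDFs the argument is referenced in uppercase:
CREATE OR REPLACE FUNCTION area_of_circle(radius FLOAT)
  RETURNS FLOAT
  AS 'pi() * radius * radius';
CREATE OR REPLACE FUNCTION double_it(x FLOAT)
  RETURNS FLOAT
  LANGUAGE JAVASCRIPT
  AS 'return X * 2;';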
96: Which function returns the name of the warehouse of the current session?
- ACTIVE_WAREHOUSE()
- RUNNING_WAREHOUSE()
- CURRENT_WAREHOUSE()
- WAREHOUSE()
Solution: 3. I’m not a big fan of learning commands by heart, and they are unlikely to appear on the exam, but this one may be useful. You have other commands to show the current database and schema, as you can see by executing the following command:
SELECT CURRENT_WAREHOUSE(), CURRENT_DATABASE(), CURRENT_SCHEMA();
97: Can a Warehouse be assigned to more than one Resource Monitor?
- True
- False
Solution: 2. A monitor can be assigned to one or more warehouses. However, a warehouse can be assigned to only a single resource monitor below the account level.
98: Which roles can create shares and resource monitors?
- ACCOUNTADMIN
- SECURITYADMIN
- SYSADMIN
- USERADMIN
Solution: 1. ACCOUNTADMIN is the only role that is able to create Shares and Resource Monitors by default. However, account administrators can choose to enable users with other roles to view and modify resource monitors using SQL.
99: Which Snowflake object returns a set of rows instead of a single, scalar value, and can be accessed in the FROM clause of a query?
- UDF
- UDTF
- Stored procedure
Solution: 2. User-defined functions (UDFs) let you extend the system to perform operations that are not available through Snowflake’s built-in, system-defined functions. UDTFs can return multiple rows for each input row; that’s the main difference from scalar UDFs.
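A minimal sketch of a UDTF, assuming a hypothetical orders table; the key differences from a scalar UDF are the RETURNS TABLE clause and the TABLE(...) wrapper in the FROM clause:
CREATE OR REPLACE FUNCTION orders_for_customer(id NUMBER)
  RETURNS TABLE (order_id NUMBER, amount NUMBER)
  AS 'SELECT order_id, amount FROM orders WHERE customer_id = id';
SELECT * FROM TABLE(orders_for_customer(42));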
100: To improve the performance, which of the below techniques can be used in Snowflake?
- Cluster Keys
- Multi-Warehouses
- Maximize the cache use
- Increasing the Warehouse Size
- Dedicated Warehouses
Solution: 1, 2, 3, 4, 5. All of these techniques help: cluster keys reduce the micro-partitions scanned, multi-cluster warehouses and bigger warehouse sizes add compute for concurrency and heavy queries, making the most of the caches avoids recomputation, and dedicated warehouses keep different workloads from competing for resources.
More Practice Exam Questions?
- Do you want more than 300 Snowflake SnowPro Core practice questions?
- Access to a real exam simulator to thoroughly prepare for the exam.
- Do you want to download THESE questions in PDF FOR FREE?
- Download the ultimate cheat sheet for the Snowflake SnowPro Core exam!
Visit FullCertified today and maximize your certification preparation!
Thanks for Reading!
If you like my work and want to support me:
- You can follow me on Medium here.
- Feel free to clap if this post is helpful for you! :)