docs/data-engineering/author-execute-notebook.md
### Cell output security
You can use [OneLake data access roles (preview)](../onelake/security/get-started-onelake-security.md) to configure access to only specific folders in a lakehouse during notebook queries. Users without access to a folder or table see an unauthorized error during query execution.
> [!IMPORTANT]
> Security only applies during query execution. Notebook cells that contain query results can be viewed by users that aren't authorized to run queries against the data directly.
### Lock or freeze a cell
The find and replace option can help you match and locate the keywords or expressions within your notebook content.
:::image type="content" source="media\author-execute-notebook\find-replace.png" alt-text="Screenshot showing find and replace pane." lightbox="media\author-execute-notebook\find-replace.png":::
## Copilot inline code completion (preview)
Copilot inline code completion is an AI-powered feature that helps you write Python code faster and more efficiently in Fabric notebooks. This feature provides intelligent, context-aware code suggestions as you type. It reduces repetitive tasks, minimizes syntax errors, and accelerates development by integrating seamlessly into your notebook workflow.
- **AI-driven completions:** Generates suggestions based on your notebook's context using a model trained on millions of lines of code.
- **Boosts productivity:** Helps write complex functions, reduces repetitive coding, and speeds up exploration of unfamiliar libraries.
- **Reduces errors:** Minimizes typos and syntax mistakes with intelligent, context-aware completions.
- **Minimal setup:** Built into Fabric notebooks and doesn't require any installation. You can just enable it and start coding.
### How it works
**How do ABT and idle session timeout impact long-running Fabric notebook executions?**
If your tenant uses activity-based timeout (ABT), long-running interactive jobs in Fabric notebooks might be impacted by Microsoft 365's idle session timeout policy. This security feature is designed to sign out users on inactive, nonmanaged devices, even if a notebook job is still running. While activity in other Microsoft 365 apps can keep the session alive, idle devices are signed out by design.
**Why are users signed out even when a notebook job is still running?**
Idle session timeout prioritizes security by ending sessions on inactive devices to prevent unauthorized access. Even when a notebook execution is in progress, the session ends if the device shows no activity. Keeping sessions open on idle devices would compromise security, which is why the current behavior is enforced.
### Inline Apache Spark job indicator
Fabric notebooks are Apache Spark based. Code cells are executed on the Apache Spark cluster remotely. A Spark job progress indicator with a real-time progress bar appears to help you understand the job execution status. The number of tasks per job or stage helps you identify the degree of parallelism of your Spark job. You can also drill deeper into the Spark UI of a specific job (or stage) by selecting the link on the job (or stage) name.
You can also find the **Cell level real-time log** next to the progress indicator, and **Diagnostics** can provide you with useful suggestions to help refine and debug the code.
You can personalize your Spark session with the magic command **%%configure**.
> - We recommend that you set the same value for "driverMemory" and "executorMemory" in %%configure. The "driverCores" and "executorCores" values should also be the same.
> - The "defaultLakehouse" will overwrite your pinned lakehouse in Lakehouse explorer, but that only works in your current notebook session.
> - You can use %%configure in Fabric pipelines, but if it isn't set in the first code cell, the pipeline run fails because the session can't be restarted.
> - %%configure is ignored when a notebook is run with notebookutils.notebook.run, but it still takes effect when the notebook is run with %run.
> - The standard Spark configuration properties must be used in the "conf" body. Fabric doesn't support first-level references for the Spark configuration properties.
> - Some special Spark properties, including "spark.driver.cores", "spark.executor.cores", "spark.driver.memory", "spark.executor.memory", and "spark.executor.instances", don't take effect in the "conf" body.
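To make these notes concrete, here's a minimal sketch of a %%configure cell that follows them. The values and the lakehouse name are illustrative placeholders, not recommendations (the JSON body can't carry comments, so treat everything below as an example):

```python
%%configure
{
    "driverMemory": "28g",
    "driverCores": 4,
    "executorMemory": "28g",
    "executorCores": 4,
    "defaultLakehouse": {
        "name": "myLakehouse"
    },
    "conf": {
        "spark.sql.shuffle.partitions": "200"
    }
}
```

Note how the driver and executor memory and cores are kept equal, the fully qualified Spark property goes inside the "conf" body, and the session-level resources stay at the top level, per the notes above.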
docs/data-engineering/notebook-utilities.md
### Copy file
This method copies a file or directory, and supports copying across file systems. Set `recurse=True` to copy all files and directories recursively.
```python
notebookutils.fs.cp('source file or directory', 'destination file or directory', recurse=True)
```
> [!NOTE]
> Due to the [limitations of OneLake shortcuts](../onelake/onelake-shortcuts.md#limitations-and-considerations), when you need to use `notebookutils.fs.cp()` to copy data from an S3/GCS type shortcut, we recommend using a mounted path instead of an abfss path.
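As a rough sketch of that workaround, you can mount the lakehouse that contains the shortcut and copy through the mounted path. The mount point, workspace and lakehouse names, and file paths below are hypothetical:

```python
# Mount the lakehouse that contains the S3/GCS shortcut
# (replace the workspace and lakehouse placeholders with your own)
notebookutils.fs.mount(
    "abfss://<workspace>@onelake.dfs.fabric.microsoft.com/<lakehouse>.Lakehouse",
    "/s3mount"
)

# Copy via the local mounted path instead of an abfss path
mount_path = notebookutils.fs.getMountPath("/s3mount")
notebookutils.fs.cp(f"file://{mount_path}/Files/myS3Shortcut/data.csv", "Files/data-copy/data.csv")
```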
### Performant copy file
This method offers a more efficient approach to copying or moving files, particularly when dealing with large data volumes. For better performance on Fabric, use `fastcp` instead of the traditional `cp` method.
```python
notebookutils.fs.fastcp('source file or directory', 'destination file or directory', recurse=True)
```
**Considerations:**
### Delete file or directory
This method removes a file or directory. Set `recurse=True` to remove all files and directories recursively.
```python
notebookutils.fs.rm('file path', recurse=True)
```
`notebookutils.udf` provides utilities designed for integrating notebook code with User Data Functions (UDFs). These utilities allow you to access functions from a UDF item within the same workspace or across different workspaces. You can then invoke functions within a UDF item as needed.
Here are some examples of how to use the UDF utilities:
```python
# Get functions
myFunctions = notebookutils.udf.getFunctions('UDFItemName') # Get functions from a UDF within the same workspace
myFunctions = notebookutils.udf.getFunctions('UDFItemName', 'workspaceId') # Get functions from a UDF in a different workspace
```
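After retrieving the handle, you can call a function on it directly. The function name and argument in this sketch are hypothetical; use the names defined in your own UDF item:

```python
# Invoke a function defined in the UDF item (hypothetical name and argument)
result = myFunctions.hello('Fabric')
print(result)
```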
> - The notebook code references the variables defined in the active value set of the Variable Library.
## Known issues
- When using a runtime version above 1.2 and running `notebookutils.help()`, the listed **fabricClient** and **PBIClient** APIs aren't supported for now; they'll be available in the future. Additionally, the **Credentials** API isn't supported in Scala notebooks for now.