Commit 84c9eb1

Merge pull request #2224 from MicrosoftDocs/main638939414920034066sync_temp
For protected branch, push strategy should use PR and merge to target branch method to work around git push error
2 parents: 53468af + c309902

File tree

2 files changed: +23 -21 lines


docs/data-engineering/author-execute-notebook.md

Lines changed: 7 additions & 7 deletions
````diff
@@ -167,10 +167,10 @@ Select the **More commands** ellipses (...) on the cell toolbar and **Hide outpu
 
 ### Cell output security
 
-Using [OneLake data access roles (preview)](../onelake/security/get-started-onelake-security.md), users can configure access to only specific folders in a lakehouse during notebook queries. Users without access to a folder or table see an unauthorized error during query execution.
+You can use [OneLake data access roles (preview)](../onelake/security/get-started-onelake-security.md) to configure access to only specific folders in a lakehouse during notebook queries. Users without access to a folder or table see an unauthorized error during query execution.
 
 > [!IMPORTANT]
-> Security only applies during query execution and any notebook cells containing query results can be viewed by users that aren't authorized to run queries against the data directly.
+> Security only applies during query execution. Notebook cells that contain query results can be viewed by users that aren't authorized to run queries against the data directly.
 
 ### Lock or freeze a cell
 
````
````diff
@@ -204,7 +204,7 @@ The find and replace option can help you match and locate the keywords or expres
 
 :::image type="content" source="media\author-execute-notebook\find-replace.png" alt-text="Screenshot showing find and replace pane." lightbox="media\author-execute-notebook\find-replace.png":::
 
-## Copilot inline code completion (Preview)
+## Copilot inline code completion (preview)
 
 Copilot inline code completion is an AI-powered feature that helps you to write Python code faster and more efficiently in Fabric Notebooks. This feature provides intelligent, context-aware code suggestions as you type code. It reduces repetitive tasks, minimizes syntax errors, and accelerates development by integrating seamlessly into your notebook workflow.
 
````
````diff
@@ -213,7 +213,7 @@ Copilot inline code completion is an AI-powered feature that helps you to write
 * **AI-driven completions:** Generates suggestions based on your notebook's context using a model trained on millions of lines of code.
 * **Boosts productivity:** Helps write complex functions, reduces repetitive coding, and speeds up exploration of unfamiliar libraries.
 * **Reduces errors:** Minimizes typos and syntax mistakes with intelligent, context-aware completions.
-* **Minimal setup:** Built into Fabric notebooks, doesn't require any installation. You can just enable it and start coding.
+* **Minimal setup:** Built into Fabric notebooks and doesn't require any installation. You can just enable it and start coding.
 
 ### How it works
 
````
````diff
@@ -361,15 +361,15 @@ You can also set timeout as described in:
 
 **How do ABT and idle session timeout impact long-running Fabric Notebook executions?**
 
-If your tenant uses activity-based timeout (ABT), long-running interactive jobs in Fabric notebooks may be impacted by Microsoft 365's idle session timeout policy. This security feature is designed to sign out users on inactive, nonmanaged devices, even if a notebook job is still running. While activity in other Microsoft 365 apps can keep the session alive, idle devices are signed out by design.
+If your tenant uses activity-based timeout (ABT), long-running interactive jobs in Fabric notebooks might be impacted by Microsoft 365's idle session timeout policy. This security feature is designed to sign out users on inactive, nonmanaged devices, even if a notebook job is still running. While activity in other Microsoft 365 apps can keep the session alive, idle devices are signed out by design.
 
 **Why are users signed out even when a notebook job is still running?**
 
 Idle session timeout prioritizes security by ending sessions on inactive devices to prevent unauthorized access. Even when a notebook execution is in progress, the session ends if the device shows no activity. Keeping sessions open on idle devices would compromise security, which is why the current behavior is enforced.
 
 ### Inline Apache Spark job indicator
 
-The Fabric notebook is Apache Spark based. Code cells are executed on the Apache Spark cluster remotely. A Spark job progress indicator is provided with a real-time progress bar that appears to help you understand the job execution status. The number of tasks per each job or stage helps you to identify the parallel level of your Spark job. You can also drill deeper to the Spark UI of a specific job (or stage) via selecting the link on the job (or stage) name.
+Fabric notebooks are Apache Spark based. Code cells are executed on the Apache Spark cluster remotely. A Spark job progress indicator is provided with a real-time progress bar that appears to help you understand the job execution status. The number of tasks per each job or stage helps you to identify the parallel level of your Spark job. You can also drill deeper to the Spark UI of a specific job (or stage) via selecting the link on the job (or stage) name.
 
 You can also find the **Cell level real-time log** next to the progress indicator, and **Diagnostics** can provide you with useful suggestions to help refine and debug the code.
 
````
````diff
@@ -576,7 +576,7 @@ You can personalize your Spark session with the magic command **%%configure**. F
 > - We recommend that you set the same value for "DriverMemory" and "ExecutorMemory" in %%configure. The "driverCores" and "executorCores" values should also be the same.
 > - The "defaultLakehouse" will overwrite your pinned lakehouse in Lakehouse explorer, but that only works in your current notebook session.
 > - You can use %%configure in Fabric pipelines, but if it's not set in the first code cell, the pipeline run fails due to can't restart session.
-> - The %%configure used in notebookutils.notebook.run will be ignored but used in %run notebook will continue executing.
+> - The %%configure used in notebookutils.notebook.run is ignored but used in %run notebook continues executing.
 > - The standard Spark configuration properties must be used in the "conf" body. Fabric doesn't support first level reference for the Spark configuration properties.
 > - Some special Spark properties, including "spark.driver.cores", "spark.executor.cores", "spark.driver.memory", "spark.executor.memory", and "spark.executor.instances" don't take effect in "conf" body.
 
````

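For quick reference, here is a minimal `%%configure` sketch consistent with the notes in the hunk above; the sizing values and lakehouse placeholders are illustrative, not prescriptive:

```python
%%configure
{
    "driverMemory": "28g",
    "driverCores": 4,
    "executorMemory": "28g",
    "executorCores": 4,
    "defaultLakehouse": {
        "name": "<lakehouse-name>",
        "workspaceId": "<workspace-id>"
    },
    "conf": {
        "spark.sql.shuffle.partitions": "200"
    }
}
```

Per the notes, driver and executor sizing stay symmetric, the special sizing properties sit at the top level rather than in the "conf" body, and the cell must be the first code cell when the notebook runs in a pipeline.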
docs/data-engineering/notebook-utilities.md

Lines changed: 16 additions & 14 deletions
````diff
@@ -113,21 +113,21 @@ notebookutils.fs.ls("file:/<new_dir>") # based on local file system of driver n
 
 ### Copy file
 
-This method copies a file or directory, and supports copy activity across file systems.
+This method copies a file or directory, and supports copy activity across file systems. We set `recurse=True` to copy all files and directories recursively.
 
 ```python
-notebookutils.fs.cp('source file or directory', 'destination file or directory', True)# Set the third parameter as True to copy all files and directories recursively
+notebookutils.fs.cp('source file or directory', 'destination file or directory', recurse=True)
 ```
 
 > [!NOTE]
-> Due to the [limitations of OneLake shortcut](../onelake/onelake-shortcuts.md#limitations-and-considerations), when you need to use ```notebookutils.fs.cp()``` to copy data from S3/GCS type shortcut, it is recommended to use a mounted path instead of an abfss path.
+> Due to the [limitations of OneLake shortcut](../onelake/onelake-shortcuts.md#limitations-and-considerations), when you need to use `notebookutils.fs.cp()` to copy data from S3/GCS type shortcut, it is recommended to use a mounted path instead of an abfss path.
 
 ### Performant copy file
 
 This method offers a more efficient approach to copying or moving files, particularly when dealing with large data volumes. For enhanced performance on Fabric, it is advisable to utilize `fastcp` as a substitute for the traditional `cp` method.
 
 ```python
-notebookutils.fs.fastcp('source file or directory', 'destination file or directory', True)# Set the third parameter as True to copy all files and directories recursively
+notebookutils.fs.fastcp('source file or directory', 'destination file or directory', recurse=True)
 ```
 
 **Considerations:**
````
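To make the NOTE about S3/GCS shortcuts concrete, here is a hedged sketch that copies through a mounted path instead of an abfss path; the OneLake URI, mount point, and folder names are hypothetical:

```python
# Mount the lakehouse that contains the S3 shortcut (URI is illustrative)
notebookutils.fs.mount(
    "abfss://<workspace>@onelake.dfs.fabric.microsoft.com/<lakehouse>.Lakehouse",
    "/s3_lake"
)

# Resolve the driver-local mount path and copy through it, not through abfss
src = notebookutils.fs.getMountPath("/s3_lake") + "/Files/s3_shortcut/raw"
notebookutils.fs.cp(f"file://{src}", "Files/landing/raw", recurse=True)
```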
````diff
@@ -175,10 +175,10 @@ notebookutils.fs.append("file path", "content to append", True) # Set the last p
 
 ### Delete file or directory
 
-This method removes a file or directory.
+This method removes a file or directory. We set `recurse=True` to remove all files and directories recursively.
 
 ```python
-notebookutils.fs.rm('file path', True) # Set the last parameter as True to remove all files and directories recursively
+notebookutils.fs.rm('file path', recurse=True)
 ```
 
 ### Mount/unmount directory
````
````diff
@@ -443,20 +443,22 @@ artifacts_list = notebookutils.notebook.list("optional_workspace_id")
 
 ```notebookutils.udf``` provides utilities designed for integrating Notebook code with User Data Functions (UDFs). These utilities allow you to access functions from a UDF item within the same workspace or across different workspaces. You can then invoke functions within a UDF item as needed.
 
-Here is an overview of the available methods:
+Here are some examples of how to use the UDF utilities:
 
 ```python
-# Get functions
-myFunctions = notebookutils.udf.getFunctions('UDFItemName') # Get functions from UDF within the same workspace
-myFunctions = notebookutils.udf.getFunctions('UDFItemName', 'workspaceId') # Get functions from UDF across different workspace
+# Get functions from a UDF item
+myFunctions = notebookutils.udf.getFunctions('UDFItemName')
+# Or from another workspace
+myFunctions = notebookutils.udf.getFunctions('UDFItemName', 'workspaceId')
 
-# Additional helper method to return all functions, their respective parameters, and types.
+# Display function and item details
 display(myFunctions.functionDetails)
 display(myFunctions.itemDetails)
 
-# Invoke the function
+# Invoke a function
 myFunctions.functionName('value1', 'value2')
-myFunctions.functionName(parameter1='value1', parameter2='value2'...) # Another way to invoke the function
+# Or with named parameters
+myFunctions.functionName(parameter1='value1', parameter2='value2')
 ```
 
 ### Retrieve functions from a UDF
````
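Putting the UDF helpers from the hunk above together, a hypothetical end-to-end flow; the `SalesUdf` item and `applyDiscount` function are invented for illustration:

```python
# Fetch the callable functions from a UDF item in this workspace (item name is hypothetical)
sales = notebookutils.udf.getFunctions('SalesUdf')

# Inspect the available functions and their parameters before invoking anything
display(sales.functionDetails)

# Invoke one function with named parameters (function and parameter names are hypothetical)
result = sales.applyDiscount(orderId='12345', percent='10')
print(result)
```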
````diff
@@ -1000,7 +1002,7 @@ notebookutils.variableLibrary.get("$(/**/samplevl/test_bool)")
 > - The notebook code references the variables defined in the active value set of the Variable Library.
 
 
-## Known issue
+## Known issues
 
 - When using runtime version above 1.2 and run ``` notebookutils.help() ```, the listed **fabricClient**, **PBIClient** APIs are not supported for now, will be available in the further. Additionally, the **Credentials** API isn't supported in Scala notebooks for now.
 
````