Commit

Apply @mofeing suggestions from code review
Co-authored-by: Sergio Sánchez Ramírez <[email protected]>
jofrevalles and mofeing authored Sep 15, 2023
1 parent bc53d77 commit d71674b
Showing 2 changed files with 2 additions and 11 deletions.
9 changes: 0 additions & 9 deletions docs/refs.bib
@@ -105,13 +105,4 @@ @article{arute2019quantum
year={2019},
publisher={Nature Publishing Group}
}
@article{gray2021hyper,
title={Hyper-optimized tensor network contraction},
author={Gray, Johnnie and Kourtis, Stefanos},
journal={Quantum},
volume={5},
pages={410},
year={2021},
publisher={Verein zur F{\"o}rderung des Open Access Publizierens in den Quantenwissenschaften}
}
}
4 changes: 2 additions & 2 deletions docs/src/transformations.md
@@ -4,7 +4,7 @@ In tensor network computations, it is good practice to apply various transformat

A crucial reason why these methods are indispensable is their ability to drastically reduce the problem size of both the contraction path search and the contraction itself. This doesn't necessarily involve reducing the maximum rank of the Tensor Network, but rather the size (or rank) of the involved tensors.

Our approach has been significantly inspired by the ideas presented in the [Quimb](https://quimb.readthedocs.io/) library, explained in [this paper](https://arxiv.org/pdf/2002.01935.pdf)[gray2021hyper](@cite).
Our approach is based on [gray2021hyper](@cite), whose techniques are also implemented in [quimb](https://quimb.readthedocs.io/).

In Tenet, we provide a set of predefined transformations which you can apply to your `TensorNetwork` using the `transform`/`transform!` functions.
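
As a rough illustration, the snippet below builds a tiny `TensorNetwork` containing a diagonal tensor and applies a single transformation both out-of-place and in-place. This is only a sketch: the transformation name `DiagonalReduction` and the exact call forms are assumptions and may differ from what Tenet actually exports.

```julia
using Tenet

# A two-tensor network where the first tensor is diagonal over (:i, :j),
# so a diagonal-reduction style transformation has something to simplify.
D = Tensor([1.0 0.0; 0.0 2.0], (:i, :j))
B = Tensor(rand(2, 2), (:j, :k))
tn = TensorNetwork([D, B])

# Out-of-place: returns a transformed copy, leaving `tn` untouched.
reduced = transform(tn, DiagonalReduction)

# In-place variant: mutates `tn` directly.
transform!(tn, DiagonalReduction)
```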

@@ -249,7 +249,7 @@ fig #hide

## Example: RQC simplification

Here we show how we can reduce the complexity of the tensor network by applying a transformation to it. We take as an example the Sycamore circuit from Google's quantum advantage paper [arute2019quantum](@cite).
Local transformations can dramatically reduce the complexity of tensor networks. Take as an example the Random Quantum Circuit (RQC) run on the Sycamore chip in Google's quantum advantage experiment [arute2019quantum](@cite).
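
The actual circuit-loading and plotting code for this example is collapsed in this diff. Purely as a hypothetical stand-in, the sketch below chains a couple of simplifications on a small hand-built network and compares the number of tensors before and after; the transformation names `RankSimplification` and `DiagonalReduction` are assumptions, not necessarily the ones Tenet exports.

```julia
using Tenet

# Stand-in network (not the Sycamore RQC): a short chain with a rank-1 bond
# and a diagonal tensor, so the simplifications below have work to do.
tn = TensorNetwork([
    Tensor(rand(2, 2), (:a, :b)),
    Tensor(rand(2, 1), (:b, :c)),
    Tensor(rand(1, 2), (:c, :d)),
    Tensor([1.0 0.0; 0.0 3.0], (:d, :e)),
])

println("tensors before: ", length(tensors(tn)))

# Apply several local transformations in sequence.
for T in (RankSimplification, DiagonalReduction)
    transform!(tn, T)
end

println("tensors after:  ", length(tensors(tn)))
```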

```@setup plot
using Makie
