
Implement immediate/synchronous effects #2822

Open · gbj opened this issue Aug 13, 2024 · 3 comments

gbj (Collaborator) commented Aug 13, 2024

Is your feature request related to a problem? Please describe.

0.7 changes effect scheduling to automatically batch effects, and to run them asynchronously, as determined by whichever async runtime is being used. In the default web case, this is wasm-bindgen-futures, which uses queueMicrotask. In other words, updating a signal updates its value immediately, but runs effects on the next microtask. This means that an effect no longer runs immediately, once per signal change; it runs once per tick, with the latest values.

In general, this is good, because effects are intended to synchronize the current/latest value of a signal with the outside world in some way that is typically orders of magnitude more expensive than actually updating the signal: for example, updating a DOM node.

This has several benefits:

  1. Updating a signal multiple times, synchronously, does not unnecessarily run that expensive effect (like updating a DOM node) multiple times
  2. Updating multiple signals that are read in an effect does not run that effect multiple times. (Currently, users can opt into batching with batch(); more than a handful of bug reports have boiled down to "use batch()", which was not obvious to the user. See the batch() sketch after the example below.)

The canonical example of why this is good would be something like

```rust
let a = RwSignal::new(1);
let b = RwSignal::new(2);

Effect::new(move |_| {
  let value = a.get() + b.get();
  // do something expensive with `value`, like updating the DOM
});

// ...
on:click=move |_| {
  a.set(3);
  b.set(4);
}

// how many times did you want the effect to run?
// did you really intend there to be an inconsistent state
// where the effect had the old value of `b` and new value of `a`?
```
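For comparison, the opt-in batching mentioned above looks roughly like this under the 0.6 model (a sketch; with 0.7's automatic batching it becomes redundant):

```rust
// 0.6-era pattern: batch() defers effect notification until the closure
// returns, so the effect above sees a = 3 and b = 4 together and runs once.
batch(move || {
    a.set(3);
    b.set(4);
});
```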

However, there are some situations in which users want an effect that runs immediately when signals are changed, and that does not batch in this way.

Describe the solution you'd like

An alternative effect type whose ReactiveGraph and Subscriber implementations are designed such that, when marked check or dirty, it simply runs its inner function, instead of triggering a notification to wake up an async task.

This means Effect and render effects (created for closures passed into the view, for example) run in the async, batched way, but users can opt into immediate effects for other cases.
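To make the proposed semantics concrete, here is a standalone toy model in plain Rust. None of this is Leptos internals and all names are illustrative; it only shows what "being notified means running, synchronously, inside set" implies, including the inconsistent intermediate state called out in the example above.

```rust
use std::cell::RefCell;
use std::rc::Rc;

// Toy signal with "immediate effect" semantics: notifying a subscriber
// means running it right away, with no microtask queue in between.
struct Signal<T> {
    value: RefCell<T>,
    subscribers: RefCell<Vec<Box<dyn Fn()>>>,
}

impl<T: Copy> Signal<T> {
    fn new(value: T) -> Rc<Self> {
        Rc::new(Self {
            value: RefCell::new(value),
            subscribers: RefCell::new(Vec::new()),
        })
    }

    fn get(&self) -> T {
        *self.value.borrow()
    }

    fn set(&self, value: T) {
        *self.value.borrow_mut() = value;
        // Immediate semantics: every write runs every subscriber synchronously.
        for sub in self.subscribers.borrow().iter() {
            sub();
        }
    }

    fn subscribe(&self, f: Box<dyn Fn()>) {
        self.subscribers.borrow_mut().push(f);
    }
}

fn main() {
    let a = Signal::new(1);
    let b = Signal::new(2);

    // One "effect" registered with both of its sources.
    let effect = {
        let (a, b) = (Rc::clone(&a), Rc::clone(&b));
        move || println!("sum = {}", a.get() + b.get())
    };
    a.subscribe(Box::new(effect.clone()));
    b.subscribe(Box::new(effect));

    a.set(3); // prints "sum = 5" immediately: new `a`, old `b`
    b.set(4); // prints "sum = 7" immediately
}
```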

Describe alternatives you've considered

  1. Making all effects immediate, and requiring opt-in batching. (Status quo ante 0.7)
  2. A flush_sync() that allows you to tell all effects to run now, not on the next microtask. (Difficult to implement because the async runtimes we piggyback on don't expose what we need to do this.)
gbj added this to the 0.7 milestone Aug 13, 2024
connorgmeehan commented:
I have a use-case that would benefit from this. I am building a custom renderer where the data is owned externally to leptos, so I need to know when leptos wants access to that data in order to pause other processes and make it available. Another solution that might work for me is being able to manage the async effects myself, e.g. implementing an EffectQueue trait and executing it from my "other processes".

benwis (Contributor) commented Aug 14, 2024

> I have a use-case that would benefit from this. I am building a custom renderer where the data is owned externally to leptos, so I need to know when leptos wants access to that data in order to pause other processes and make it available. Another solution that might work for me is being able to manage the async effects myself, e.g. implementing an EffectQueue trait and executing it from my "other processes".

I'm curious if you've looked at something like RenderEffect, which is synchronous and runs immediately:
https://docs.rs/leptos/0.7.0-beta/leptos/prelude/struct.RenderEffect.html
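A minimal usage sketch, assuming the constructor shown in those docs (the closure receives the value it returned on the previous run, None the first time, and that first run happens synchronously when the RenderEffect is created inside a reactive context):

```rust
use leptos::prelude::*;

let count = RwSignal::new(0);

// Runs once, synchronously, at construction, and again when `count` changes;
// `prev` is whatever the closure returned on its previous run.
let _effect = RenderEffect::new(move |prev: Option<i32>| {
    let value = count.get();
    leptos::logging::log!("count went from {prev:?} to {value}");
    value
});
```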

connorgmeehan commented Aug 14, 2024

> I'm curious if you've looked at something like RenderEffect, which is synchronous and runs immediately:

Hmmm, I know of it but I hadn't thought about it. The effects I'm having trouble with are produced internally by leptos (I think to update the renderer). I'll see if there's a way I can make it use RenderEffect instead.

Update: It is using RenderEffect under the hood to re-render the markup; however, the re-run is scheduled with wasm-bindgen-futures' spawn_local, so it happens on the next tick. I've attached the callstack for reference.

Callstack:

```text
with_world: Could not get the `&World` from the static RefCell.  This means that this was called out of sync.

Stack:

__wbg_get_imports/imports.wbg.__wbg_new_abda76e883ba8a5f@http://localhost:8080/bevy-5c00443804c794be.js:1938:21
bevy-93e47f7eb941ef20.wasm.console_error_panic_hook::hook::h47c963ebcad3a381@http://localhost:8080/bevy-5c00443804c794be_bg.wasm:wasm-function[7678]:0xf2cb1c
bevy-93e47f7eb941ef20.wasm.core::ops::function::Fn::call::h807d8b8a14f16a5e@http://localhost:8080/bevy-5c00443804c794be_bg.wasm:wasm-function[86117]:0x1a7baa2
bevy-93e47f7eb941ef20.wasm.std::panicking::rust_panic_with_hook::h6731baa78621a747@http://localhost:8080/bevy-5c00443804c794be_bg.wasm:wasm-function[25150]:0x16727d7
bevy-93e47f7eb941ef20.wasm.std::panicking::begin_panic_handler::{{closure}}::hb6cd8464ed39ae71@http://localhost:8080/bevy-5c00443804c794be_bg.wasm:wasm-function[30902]:0x1798e70
bevy-93e47f7eb941ef20.wasm.std::sys_common::backtrace::__rust_end_short_backtrace::hbdf3ddeb21a1e747@http://localhost:8080/bevy-5c00443804c794be_bg.wasm:wasm-function[84715]:0x1a78b23
bevy-93e47f7eb941ef20.wasm.rust_begin_unwind@http://localhost:8080/bevy-5c00443804c794be_bg.wasm:wasm-function[48499]:0x197ead3
bevy-93e47f7eb941ef20.wasm.core::panicking::panic_fmt::h5c7ce52813e94bcd@http://localhost:8080/bevy-5c00443804c794be_bg.wasm:wasm-function[49690]:0x1991e64
bevy-93e47f7eb941ef20.wasm.core::panicking::panic_display::hbd841ae85eb3dff4@http://localhost:8080/bevy-5c00443804c794be_bg.wasm:wasm-function[47629]:0x196efa7
bevy-93e47f7eb941ef20.wasm.core::option::expect_failed::h637b3c5bf9792ea8@http://localhost:8080/bevy-5c00443804c794be_bg.wasm:wasm-function[53865]:0x19ca5c8
bevy-93e47f7eb941ef20.wasm.std::thread::local::LocalKey<T>::with::h431ade8fa496fabe@http://localhost:8080/bevy-5c00443804c794be_bg.wasm:wasm-function[25151]:0x16728e4
bevy-93e47f7eb941ef20.wasm.bevy::leptos_bevy::core::BevyLeptosState::untrack_resource::h00ea65125970feae@http://localhost:8080/bevy-5c00443804c794be_bg.wasm:wasm-function[2318]:0x8a3047
bevy-93e47f7eb941ef20.wasm.reactive_graph::traits::Update::update::h97dbc16cfd088dbd@http://localhost:8080/bevy-5c00443804c794be_bg.wasm:wasm-function[16988]:0x14073f1
bevy-93e47f7eb941ef20.wasm.reactive_graph::computed::create_write_slice::{{closure}}::h2c5689d6de0db529@http://localhost:8080/bevy-5c00443804c794be_bg.wasm:wasm-function[47809]:0x1972717
bevy-93e47f7eb941ef20.wasm.reactive_graph::owner::arena::Arena::with::h2de8bb1c52f05a48@http://localhost:8080/bevy-5c00443804c794be_bg.wasm:wasm-function[12946]:0x125d00f
bevy-93e47f7eb941ef20.wasm.<reactive_graph::wrappers::write::SignalSetter<T,S> as reactive_graph::traits::Set>::set::hc9d1b46c948a186a@http://localhost:8080/bevy-5c00443804c794be_bg.wasm:wasm-function[21459]:0x157e51e
bevy-93e47f7eb941ef20.wasm.core::ops::function::FnOnce::call_once{{vtable.shim}}::h5024550bfba8c689@http://localhost:8080/bevy-5c00443804c794be_bg.wasm:wasm-function[45964]:0x194c94d
bevy-93e47f7eb941ef20.wasm.<reactive_graph::owner::OwnerInner as core::ops::drop::Drop>::drop::hbaa07fa646fd9d77@http://localhost:8080/bevy-5c00443804c794be_bg.wasm:wasm-function[11518]:0x11a6016
bevy-93e47f7eb941ef20.wasm.core::ptr::drop_in_place<std::sync::rwlock::RwLock<reactive_graph::owner::OwnerInner>>::h9a87dcb4b02d9505@http://localhost:8080/bevy-5c00443804c794be_bg.wasm:wasm-function[15906]:0x139d33a
bevy-93e47f7eb941ef20.wasm.alloc::sync::Arc<T,A>::drop_slow::h0ab35c3e2fb5eea1@http://localhost:8080/bevy-5c00443804c794be_bg.wasm:wasm-function[48660]:0x198155f
bevy-93e47f7eb941ef20.wasm.reactive_graph::effect::render_effect::RenderEffect<T>::new_with_value::erased::{{closure}}::h9bbd88a5f17d5fd3@http://localhost:8080/bevy-5c00443804c794be_bg.wasm:wasm-function[9323]:0x105f07a
bevy-93e47f7eb941ef20.wasm.<core::pin::Pin<P> as core::future::future::Future>::poll::h2b6dee911f63217d@http://localhost:8080/bevy-5c00443804c794be_bg.wasm:wasm-function[63183]:0x1a113c0
bevy-93e47f7eb941ef20.wasm.wasm_bindgen_futures::queue::QueueState::run_all::ha04814a86dc3168f@http://localhost:8080/bevy-5c00443804c794be_bg.wasm:wasm-function[11436]:0x119b4c3
bevy-93e47f7eb941ef20.wasm.wasm_bindgen_futures::queue::Queue::new::{{closure}}::hedd756be345dae2c@http://localhost:8080/bevy-5c00443804c794be_bg.wasm:wasm-function[33222]:0x17f44ac
bevy-93e47f7eb941ef20.wasm.<dyn core::ops::function::FnMut<(A,)>+Output = R as wasm_bindgen::closure::WasmClosure>::describe::invoke::hc5aab3773a297ddd@http://localhost:8080/bevy-5c00443804c794be_bg.wasm:wasm-function[55971]:0x19df628
__wbg_adapter_58@http://localhost:8080/bevy-5c00443804c794be.js:240:10
real@http://localhost:8080/bevy-5c00443804c794be.js:209:20
```

Edit 2: I was able to implement a flush_sync for my use case, but I don't know whether it would work in the web context. If the any_spawner crate provided an init_any, people would be able to manage their own futures. I stored my futures in a thread_local queue (my leptos code has to run on the main thread anyway) and added a function that polls all of them (see the sketch below). It might also make sense to do this instead of having the glibc feature flag for the glibc example?
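Roughly what that pattern looks like, as a sketch: none of these names are any_spawner's real API; spawn_local and flush_sync below are just stand-ins for the approach described (a thread_local list of futures plus a function that polls them all), using the futures crate's noop_waker.

```rust
use std::cell::RefCell;
use std::future::Future;
use std::pin::Pin;
use std::task::Context;

use futures::task::noop_waker;

thread_local! {
    // Futures handed over by the reactive runtime (hypothetical integration
    // point; everything is main-thread only, so a thread_local is enough).
    static QUEUE: RefCell<Vec<Pin<Box<dyn Future<Output = ()>>>>> =
        RefCell::new(Vec::new());
}

// Stand-in for the executor's spawn function: just park the future in the queue.
fn spawn_local(fut: impl Future<Output = ()> + 'static) {
    QUEUE.with(|q| q.borrow_mut().push(Box::pin(fut)));
}

// Poll every queued future once; anything still pending stays queued.
fn flush_sync() {
    let waker = noop_waker();
    let mut cx = Context::from_waker(&waker);

    // Take the current batch first, so a future that spawns more work
    // doesn't re-borrow the RefCell while we're iterating.
    let mut batch = QUEUE.with(|q| std::mem::take(&mut *q.borrow_mut()));
    batch.retain_mut(|fut| fut.as_mut().poll(&mut cx).is_pending());
    QUEUE.with(|q| q.borrow_mut().append(&mut batch));
}
```

Calling flush_sync() from the host's own loop (for example once per frame, before other processes need the data back) would then play the role the microtask queue plays on the web.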
