
Searched refs:smp_mb__after_spinlock (Results 1 – 21 of 21) sorted by relevance

/linux-6.3-rc2/tools/memory-model/litmus-tests/
MP+polockmbonce+poacquiresilsil.litmus
    6    * Do spinlocks combined with smp_mb__after_spinlock() provide order
    18   smp_mb__after_spinlock();
Z6.0+pooncelock+poonceLock+pombonce.litmus
    6    * This litmus test demonstrates how smp_mb__after_spinlock() may be
    27   smp_mb__after_spinlock();
README
    74   Protect the access with a lock and an smp_mb__after_spinlock()
    153  As above, but with smp_mb__after_spinlock() immediately
/linux-6.3-rc2/kernel/kcsan/
selftest.c
    148  KCSAN_CHECK_READ_BARRIER(smp_mb__after_spinlock());      in test_barrier()
    177  KCSAN_CHECK_WRITE_BARRIER(smp_mb__after_spinlock());     in test_barrier()
    209  KCSAN_CHECK_RW_BARRIER(smp_mb__after_spinlock());        in test_barrier()
kcsan_test.c
    578  KCSAN_EXPECT_READ_BARRIER(smp_mb__after_spinlock(), true);   in test_barrier_nothreads()
    623  KCSAN_EXPECT_WRITE_BARRIER(smp_mb__after_spinlock(), true);  in test_barrier_nothreads()
    668  KCSAN_EXPECT_RW_BARRIER(smp_mb__after_spinlock(), true);     in test_barrier_nothreads()
/linux-6.3-rc2/arch/xtensa/include/asm/
spinlock.h
    18   #define smp_mb__after_spinlock() smp_mb()
/linux-6.3-rc2/arch/csky/include/asm/
spinlock.h
    10   #define smp_mb__after_spinlock() smp_mb()
/linux-6.3-rc2/arch/powerpc/include/asm/
spinlock.h
    14   #define smp_mb__after_spinlock() smp_mb()
/linux-6.3-rc2/arch/arm64/include/asm/
spinlock.h
    12   #define smp_mb__after_spinlock() smp_mb()
/linux-6.3-rc2/arch/riscv/include/asm/
barrier.h
    72   #define smp_mb__after_spinlock() RISCV_FENCE(iorw,iorw)
/linux-6.3-rc2/include/linux/
spinlock.h
    174  #ifndef smp_mb__after_spinlock
    175  #define smp_mb__after_spinlock() kcsan_mb()
/linux-6.3-rc2/tools/memory-model/
linux-kernel.bell
    33   'after-spinlock (*smp_mb__after_spinlock*) ||
linux-kernel.def
    25   smp_mb__after_spinlock() { __fence{after-spinlock}; }
/linux-6.3-rc2/tools/memory-model/Documentation/
recipes.txt
    160  of smp_mb__after_spinlock():
    174  smp_mb__after_spinlock();
    187  This addition of smp_mb__after_spinlock() strengthens the lock acquisition
ordering.txt
    160  o smp_mb__after_spinlock(), which provides full ordering subsequent
explanation.txt
    2596  smp_mb__after_spinlock(). The LKMM uses fence events with special
    2608  smp_mb__after_spinlock() orders po-earlier lock acquisition
/linux-6.3-rc2/kernel/
kthread.c
    1470  smp_mb__after_spinlock();    in kthread_unuse_mm()
exit.c
    554   smp_mb__after_spinlock();    in exit_mm()
/linux-6.3-rc2/kernel/rcu/
tree_nocb.h
    1052  smp_mb__after_spinlock(); /* Timer expire before wakeup. */    in do_nocb_deferred_wakeup_timer()
/linux-6.3-rc2/Documentation/RCU/
whatisRCU.rst
    659  smp_mb__after_spinlock();
    685  been able to write-acquire the lock otherwise. The smp_mb__after_spinlock()
/linux-6.3-rc2/kernel/sched/
core.c
    1771  smp_mb__after_spinlock();    in uclamp_sync_util_min_rt_default()
    4160  smp_mb__after_spinlock();    in try_to_wake_up()
    6540  smp_mb__after_spinlock();    in __schedule()
