author     kettenis <kettenis@openbsd.org>  2017-03-12 21:05:25 +0000
committer  kettenis <kettenis@openbsd.org>  2017-03-12 21:05:25 +0000
commit     1155022125311d0d60a6880adf895511cba73ef3 (patch)
tree       cbecfe61382274c88bfa258af8891ba18ff99fbf /sys
parent     Bring SROP mitigation to arm64. Make some small modifications to the arm (diff)
Add a "dsm ishst" barrier before TLB maintenance instructions. The ARMv8
architecture reference manual says this is required (D4.7 under "Ordering and completion of TLB maintenance instructions" to guarantee that the translation table walk can observe previous store to the page tables. It also has a note that says In all cases in this section, where a DMB or DSB is referred to, it refers to a DMB or DSB whose required access type is both loads and stores. But both Linux and FreeBSD use a Store-Store barrier here. Sadly this doesn't fix the arm64 stability problems (or at least not all of them). ok patrick@
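As a sketch of the ordering the manual describes (illustrative only, not
part of the patch; the choice of x0/x1 and the plain str are made up for
the example), a page-table update followed by a TLB flush looks roughly
like:

	/* x0 = address of the page-table entry, x1 = new PTE value (illustrative) */
	str	x1, [x0]	/* store the updated PTE */
	dsb	ishst		/* make the store observable by the table walker before the TLBI */
	tlbi	vmalle1is	/* invalidate cached translations, inner shareable domain */
	dsb	ish		/* wait for the invalidation to complete */
	isb			/* resynchronise the instruction stream */

The "ishst" variant only orders stores within the inner shareable domain,
which is the weaker barrier the quoted note warns about; a full "dsb ish"
before the TLBI would be the conservative reading, but this change follows
the store-only barrier that Linux and FreeBSD already use.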
Diffstat (limited to 'sys')
-rw-r--r--  sys/arch/arm64/arm64/cpufunc_asm.S | 5
1 file changed, 4 insertions(+), 1 deletion(-)
diff --git a/sys/arch/arm64/arm64/cpufunc_asm.S b/sys/arch/arm64/arm64/cpufunc_asm.S
index 2bfef7769b0..3c5e8dec18f 100644
--- a/sys/arch/arm64/arm64/cpufunc_asm.S
+++ b/sys/arch/arm64/arm64/cpufunc_asm.S
@@ -1,4 +1,4 @@
-/* $OpenBSD: cpufunc_asm.S,v 1.1 2017/02/06 19:23:45 patrick Exp $ */
+/* $OpenBSD: cpufunc_asm.S,v 1.2 2017/03/12 21:05:25 kettenis Exp $ */
 /*-
  * Copyright (c) 2014 Robin Randhawa
  * Copyright (c) 2015 The FreeBSD Foundation
@@ -87,6 +87,7 @@ ENTRY(cpu_setttb)
 END(cpu_setttb)
 ENTRY(cpu_tlb_flush)
+	dsb	ishst
 	tlbi	vmalle1is
 	dsb	ish
 	isb
@@ -94,6 +95,7 @@ ENTRY(cpu_tlb_flush)
 END(cpu_tlb_flush)
 ENTRY(cpu_tlb_flush_asid)
+	dsb	ishst
 	tlbi	vae1is, x0
 	dsb	ish
 	isb
@@ -101,6 +103,7 @@ ENTRY(cpu_tlb_flush_asid)
 END(cpu_tlb_flush_asid)
 ENTRY(cpu_tlb_flush_all_asid)
+	dsb	ishst
 	tlbi	vaale1is, x0
 	dsb	ish
 	isb
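For reference, each of the three flush routines ends up with the same shape
after this change; a sketch of cpu_tlb_flush as it presumably reads
post-patch (the trailing ret and the END line are assumed from the
surrounding context, since the hunk does not show the end of the function):

ENTRY(cpu_tlb_flush)
	dsb	ishst		/* order prior page-table stores before the TLBI */
	tlbi	vmalle1is	/* invalidate all EL1 translations, inner shareable */
	dsb	ish		/* wait for the invalidation to complete */
	isb
	ret			/* assumed: return to caller */
END(cpu_tlb_flush)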