Re: [PATCH v2] x86/asm: Pin sensitive CR4 bits

2019-02-21 Thread Kees Cook
On Thu, Feb 21, 2019 at 9:33 AM Sean Christopherson wrote:
> On Wed, Feb 20, 2019 at 10:09:34AM -0800, Kees Cook wrote:
> > + if (WARN_ONCE((val & cr4_pin) != cr4_pin, "cr4 bypass attempt?!\n"))
>
> Printing what bits diverged would be helpful in the unlikely event that the
> WARN_ONCE
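Sean's suggestion (report which bits diverged, not just that something did) can be modeled in a few lines. This is a hedged user-space sketch, not the patch's kernel code; `cr4_diverged_bits` is a hypothetical helper name, though the SMEP/SMAP bit positions are the real x86 ones:

```c
/* User-space model of reporting which pinned CR4 bits diverged. */
#define X86_CR4_SMEP (1UL << 20)   /* real x86 CR4 bit positions */
#define X86_CR4_SMAP (1UL << 21)

/* Stand-in for the patch's pinned-bit mask. */
static unsigned long cr4_pin = X86_CR4_SMEP | X86_CR4_SMAP;

/* Hypothetical helper: which pinned bits are missing from val?
 * A WARN_ONCE() format string could then print this mask instead
 * of a bare "bypass attempt" message. */
unsigned long cr4_diverged_bits(unsigned long val)
{
    return cr4_pin & ~val;
}
```

In the kernel this would just mean passing `cr4_pin & ~val` as an extra `%lx` argument to the existing `WARN_ONCE()` call.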

Re: [PATCH v2] x86/asm: Pin sensitive CR4 bits

2019-02-21 Thread Sean Christopherson
On Wed, Feb 20, 2019 at 10:09:34AM -0800, Kees Cook wrote:
> Several recent exploits have used direct calls to the native_write_cr4()
> function to disable SMEP and SMAP before then continuing their exploits
> using userspace memory access. This pins bits of cr4 so that they cannot
> be changed

Re: [PATCH v2] x86/asm: Pin sensitive CR4 bits

2019-02-21 Thread Kees Cook
On Thu, Feb 21, 2019 at 5:06 AM Solar Designer wrote:
>
> On Wed, Feb 20, 2019 at 01:20:58PM -0800, Kees Cook wrote:
> > On Wed, Feb 20, 2019 at 10:49 AM Solar Designer wrote:
> > >
> > > On Wed, Feb 20, 2019 at 10:09:34AM -0800, Kees Cook wrote:
> > > > + if (WARN_ONCE((val & cr4_pin) !=

Re: [PATCH v2] x86/asm: Pin sensitive CR4 bits

2019-02-21 Thread Solar Designer
On Wed, Feb 20, 2019 at 01:20:58PM -0800, Kees Cook wrote:
> On Wed, Feb 20, 2019 at 10:49 AM Solar Designer wrote:
> >
> > On Wed, Feb 20, 2019 at 10:09:34AM -0800, Kees Cook wrote:
> > > + if (WARN_ONCE((val & cr4_pin) != cr4_pin, "cr4 bypass attempt?!\n"))
> > > + goto again;
>

Re: [PATCH v2] x86/asm: Pin sensitive CR4 bits

2019-02-20 Thread Kees Cook
On Wed, Feb 20, 2019 at 10:49 AM Solar Designer wrote:
>
> On Wed, Feb 20, 2019 at 10:09:34AM -0800, Kees Cook wrote:
> > + if (WARN_ONCE((val & cr4_pin) != cr4_pin, "cr4 bypass attempt?!\n"))
> > + goto again;
>
> I think "goto again" is too mild a response given that it occurs

Re: [PATCH v2] x86/asm: Pin sensitive CR4 bits

2019-02-20 Thread Solar Designer
On Wed, Feb 20, 2019 at 10:09:34AM -0800, Kees Cook wrote:
> +extern volatile unsigned long cr4_pin;
> +
>  static inline void native_write_cr4(unsigned long val)
>  {
> +again:
> + val |= cr4_pin;
>  asm volatile("mov %0,%%cr4": : "r" (val), "m" (__force_order));
> + /*
> + * If
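The hunk quoted above can be modeled in user space to see why the pin survives a hostile write. Everything here is a sketch of the pattern, not the kernel code: `fake_cr4` stands in for the register and the inline asm, and `model_write_cr4`/`model_read_cr4` are invented names.

```c
static volatile unsigned long cr4_pin;   /* bits that must stay set */
static unsigned long fake_cr4;           /* stand-in for the real %cr4 */

/* Model of the patched native_write_cr4(): OR the pinned bits back in,
 * do the write, then re-check. The re-check matters because an attacker
 * using this function as a ROP gadget may jump in past the OR; the
 * volatile re-read of cr4_pin then notices the missing bits and loops
 * back to restore them. */
void model_write_cr4(unsigned long val)
{
again:
    val |= cr4_pin;
    fake_cr4 = val;   /* the "mov %0,%%cr4" in the real code */
    if ((val & cr4_pin) != cr4_pin)
        goto again;   /* the patch also fires WARN_ONCE() here */
}

unsigned long model_read_cr4(void)
{
    return fake_cr4;
}
```

With `cr4_pin` set, calling `model_write_cr4(0)` still leaves the pinned bits set in `fake_cr4`, which is the property the patch is after.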

[PATCH v2] x86/asm: Pin sensitive CR4 bits

2019-02-20 Thread Kees Cook
Several recent exploits have used direct calls to the native_write_cr4() function to disable SMEP and SMAP before then continuing their exploits using userspace memory access. This pins bits of cr4 so that they cannot be changed through a common function. This is not intended to be general ROP
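To read the "pins bits of cr4" mechanism end to end: the pin mask is established once, after CPU features are configured, and from then on any write through a common helper like the patch's native_write_cr4() keeps those bits set. The sketch below is hypothetical (the `lock_cr4_features` name and flag parameters are invented for illustration; only the SMEP/SMAP bit positions are real):

```c
#define X86_CR4_SMEP (1UL << 20)   /* real x86 CR4 bit positions */
#define X86_CR4_SMAP (1UL << 21)

static volatile unsigned long cr4_pin;

/* Hypothetical one-shot setup: after this runs, a pinning write helper
 * in the style of the patch can no longer produce a CR4 value with
 * these features cleared. */
void lock_cr4_features(int has_smep, int has_smap)
{
    unsigned long pin = 0;

    if (has_smep)
        pin |= X86_CR4_SMEP;
    if (has_smap)
        pin |= X86_CR4_SMAP;
    cr4_pin = pin;
}
```

Keeping the mask conditional on the detected features matters: a CPU without SMAP must not have that bit forced on, or the write would fault.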