The function was returning an int, which does not work if the u32 value
it reads, or the default value, overflows a signed int.

While callers could be made to work with the int return, by relying on a
C standard/compiler where converting out-of-range values to a signed type
is well defined and by casting carefully, it is clear that signed values
are meant to be read with ofnode_read_s32_default() instead.
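
A minimal, self-contained sketch of the problem with the old prototype.
It does not use the real ofnode API; read_u32_default_old() below is a
hypothetical stand-in that only mimics the old int return type:

#include <stdint.h>
#include <inttypes.h>
#include <stdio.h>

/* Hypothetical stand-in for the old prototype: hands back the property
 * value (or the default) through a signed int, as
 * ofnode_read_u32_default() did before this patch. */
static int read_u32_default_old(uint32_t property_value, uint32_t def)
{
	/* A value above INT_MAX is converted to int here, which is
	 * implementation-defined in C and comes out negative on the
	 * usual two's-complement targets. */
	return property_value ? property_value : def;
}

int main(void)
{
	uint32_t dt_value = 0x80000000u;	/* e.g. a base address from the DT */
	int old = read_u32_default_old(dt_value, 0);
	uint32_t back = (uint32_t)read_u32_default_old(dt_value, 0);

	printf("old int return: %d\n", old);		/* typically -2147483648 */
	printf("cast back to u32: 0x%" PRIx32 "\n", back);	/* 0x80000000 */
	return 0;
}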

Cc: Simon Glass <s...@chromium.org>
Signed-off-by: Trent Piepho <tpie...@impinj.com>
---
 drivers/core/ofnode.c | 2 +-
 include/dm/ofnode.h   | 2 +-
 2 files changed, 2 insertions(+), 2 deletions(-)

diff --git a/drivers/core/ofnode.c b/drivers/core/ofnode.c
index cc0c031e0d..0185f83399 100644
--- a/drivers/core/ofnode.c
+++ b/drivers/core/ofnode.c
@@ -39,7 +39,7 @@ int ofnode_read_u32(ofnode node, const char *propname, u32 *outp)
        return 0;
 }
 
-int ofnode_read_u32_default(ofnode node, const char *propname, u32 def)
+u32 ofnode_read_u32_default(ofnode node, const char *propname, u32 def)
 {
        assert(ofnode_valid(node));
        ofnode_read_u32(node, propname, &def);
diff --git a/include/dm/ofnode.h b/include/dm/ofnode.h
index d206ee2caa..dcda22b31b 100644
--- a/include/dm/ofnode.h
+++ b/include/dm/ofnode.h
@@ -224,7 +224,7 @@ static inline int ofnode_read_s32(ofnode node, const char *propname,
  * @def:       default value to return if the property has no value
  * @return property value, or @def if not found
  */
-int ofnode_read_u32_default(ofnode ref, const char *propname, u32 def);
+u32 ofnode_read_u32_default(ofnode ref, const char *propname, u32 def);
 
 /**
  * ofnode_read_s32_default() - Read a 32-bit integer from a property
-- 
2.14.5
