Title: [288032] trunk
Revision: 288032
Author: wenson_hs...@apple.com
Date: 2022-01-14 14:23:35 -0800 (Fri, 14 Jan 2022)

Log Message

Avoid redundant text analysis requests when long pressing inside an image that contains Live Text
https://bugs.webkit.org/show_bug.cgi?id=235129
rdar://87366539

Reviewed by Tim Horton.

Source/WebKit:

When long pressing over a non-Live-Text part of an image that otherwise contains Live Text, we currently trigger
Live Text analysis on the image a second time and re-inject the same text recognition results into the image
element. In addition to being unnecessary, this extra text analysis operation causes regular Live Text results
to be injected into images that already contain "block"-style recognized text results, replacing the blocks in
the process.

Address this by adding an optimization to avoid this unnecessary Live Text analysis and injection. See below for
more details.

Test: ImageAnalysisTests.AvoidRedundantTextRecognitionRequests

* Shared/ios/InteractionInformationAtPosition.h:
* Shared/ios/InteractionInformationAtPosition.mm:
(WebKit::InteractionInformationAtPosition::encode const):
(WebKit::InteractionInformationAtPosition::decode):

Add a new `elementContainsImageOverlay` bit to position information.
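
For reference, the new bit and its wire coding, condensed from the diffs below:

    // InteractionInformationAtPosition.h
    bool elementContainsImageOverlay { false };

    // InteractionInformationAtPosition.mm
    encoder << elementContainsImageOverlay;

    if (!decoder.decode(result.elementContainsImageOverlay))
        return false;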

* UIProcess/ios/WKContentViewInteraction.mm:
(-[WKContentView imageAnalysisGestureDidBegin:]):

In the case where the image already contains an overlay, short-circuit the logic that performs text analysis and
injects the results into the image, and instead call directly into the new helper method below to compute visual look
up results and invoke any pending context menu completion handler blocks. We pass `YES` for `hasTextResults` here
because we know the image already contains recognized text, since the `elementContainsImageOverlay` bit is set.
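
For quick reference, the early return that skips the redundant text analysis, condensed from the
WKContentViewInteraction.mm hunk below:

    if (information.elementContainsImageOverlay) {
        [strongSelf _completeImageAnalysisRequestForContextMenu:requestForContextMenu.get() requestIdentifier:requestIdentifier hasTextResults:YES];
        return;
    }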

(-[WKContentView _completeImageAnalysisRequestForContextMenu:requestIdentifier:hasTextResults:]):

Factor out logic for requesting visual look up results on the image into a separate helper method.

* WebProcess/WebPage/ios/WebPageIOS.mm:
(WebKit::videoPositionInformation):
(WebKit::imagePositionInformation):

Set the `elementContainsImageOverlay` bit (see above).
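
Condensed from the WebPageIOS.mm hunk below:

    // WebKit::videoPositionInformation
    info.elementContainsImageOverlay = ImageOverlay::hasOverlay(element);

    // WebKit::imagePositionInformation
    info.elementContainsImageOverlay = is<HTMLElement>(element) && ImageOverlay::hasOverlay(downcast<HTMLElement>(element));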

Tools:

Add a new API test and refactor some existing tests.

* TestWebKitAPI/Tests/WebKitCocoa/ImageAnalysisTests.mm:
(swizzledLocationInView):
(-[TestWKWebView simulateImageAnalysisGesture:]):

Add a helper method to simulate the image analysis gesture recognizer being activated on iOS, and wait for image
analysis to finish.
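
The helper swizzles -[UILongPressGestureRecognizer locationInView:] to control the gesture location, triggers the
gesture, waits for up to two round trips to the web process, and returns the number of VKImageAnalyzer requests that
were made (condensed from the test diff below):

    - (unsigned)simulateImageAnalysisGesture:(CGPoint)location
    {
        auto numberOfRequestsAtStart = gDidProcessRequestCount;
        gSwizzledLocationInView = location;
        InstanceMethodSwizzler gestureLocationSwizzler { UILongPressGestureRecognizer.class, @selector(locationInView:), reinterpret_cast<IMP>(swizzledLocationInView) };
        [self.textInputContentView imageAnalysisGestureDidBegin:self._imageAnalysisGestureRecognizer];
        // The process of image analysis involves at most 2 round trips to the web process.
        [self waitForNextPresentationUpdate];
        [self waitForNextPresentationUpdate];
        return gDidProcessRequestCount - numberOfRequestsAtStart;
    }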

(TestWebKitAPI::swizzledProcessRequestWithResults):
(TestWebKitAPI::TEST):

Add a new API test to verify that we make at most one additional VKImageAnalyzer request when long pressing over a
non-Live-Text part of the image a second time. Additionally, adjust an existing test (HandleImageAnalyzerErrors) to
verify that we invoke VKImageAnalyzer a total of two times during the course of one image analysis gesture (once for
Live Text recognition, and once for visual look up). That test currently passes while observing only a single request
because it ends too early (i.e. after only one round trip to the web process and back).
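
The new test, condensed from the diff below; the first gesture over the image should trigger exactly one
VKImageAnalyzer request, and a second gesture elsewhere in the same image should trigger fewer than two:

    TEST(ImageAnalysisTests, AvoidRedundantTextRecognitionRequests)
    {
        InstanceMethodSwizzler imageAnalysisRequestSwizzler { PAL::getVKImageAnalyzerClass(), @selector(processRequest:progressHandler:completionHandler:), reinterpret_cast<IMP>(swizzledProcessRequestWithResults) };

        auto webView = createWebViewWithTextRecognitionEnhancements();
        [webView synchronouslyLoadTestPageNamed:@"image"];

        EXPECT_EQ([webView simulateImageAnalysisGesture:CGPointMake(150, 100)], 1U);

        // FIXME: If we cache visual look up results as well in the future, we can bring this down to 0 (that is, no new requests).
        EXPECT_LT([webView simulateImageAnalysisGesture:CGPointMake(150, 250)], 2U);
    }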

(TestWebKitAPI::swizzledLocationInView): Deleted.
* TestWebKitAPI/cocoa/ImageAnalysisTestingUtilities.mm:
(-[TestVKImageAnalysis hasResultsForAnalysisTypes:]):

Implement this method stub to avoid an unrecognized selector crash.
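
Condensed from the ImageAnalysisTestingUtilities.mm hunk below:

    - (BOOL)hasResultsForAnalysisTypes:(VKAnalysisTypes)analysisTypes
    {
        // We only simulate text results for the time being.
        return analysisTypes == VKAnalysisTypeText && [_lines count];
    }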

Modified Paths

Diff

Modified: trunk/Source/WebKit/ChangeLog (288031 => 288032)


--- trunk/Source/WebKit/ChangeLog	2022-01-14 21:58:40 UTC (rev 288031)
+++ trunk/Source/WebKit/ChangeLog	2022-01-14 22:23:35 UTC (rev 288032)
@@ -1,3 +1,48 @@
+2022-01-14  Wenson Hsieh  <wenson_hs...@apple.com>
+
+        Avoid redundant text analysis requests when long pressing inside an image that contains Live Text
+        https://bugs.webkit.org/show_bug.cgi?id=235129
+        rdar://87366539
+
+        Reviewed by Tim Horton.
+
+        When long pressing over a non-Live-Text part of an image that otherwise contains Live Text, we currently trigger
+        Live Text analysis on the image a second time and re-inject the same text recognition results into the image
+        element. In addition to being unnecessary, this extra text analysis operation causes regular Live Text results
+        to be injected into images that already contain "block"-style recognized text results, replacing the blocks in
+        the process.
+
+        Address this by adding an optimization to avoid this unnecessary Live Text analysis and injection. See below for
+        more details.
+
+        Test: ImageAnalysisTests.AvoidRedundantTextRecognitionRequests
+
+        * Shared/ios/InteractionInformationAtPosition.h:
+        * Shared/ios/InteractionInformationAtPosition.mm:
+        (WebKit::InteractionInformationAtPosition::encode const):
+        (WebKit::InteractionInformationAtPosition::decode):
+
+        Add a new `elementContainsImageOverlay` bit to position information.
+
+        * UIProcess/ios/WKContentViewInteraction.mm:
+        (-[WKContentView imageAnalysisGestureDidBegin:]):
+
+        In the case where the image already contains an overlay, short-circuit logic to perform text analysis and inject
+        the results into the image, and instead directly call into the new helper method below to compute visual look up
+        results and invoke pending context menu completion handler blocks. We pass `YES` for `hasTextResults` here
+        because we know that the image already contains recognized text, due to the `elementContainsImageOverlay` being
+        set.
+
+        (-[WKContentView _completeImageAnalysisRequestForContextMenu:requestIdentifier:hasTextResults:]):
+
+        Factor out logic for requesting visual look up results on the image into a separate helper method.
+
+        * WebProcess/WebPage/ios/WebPageIOS.mm:
+        (WebKit::videoPositionInformation):
+        (WebKit::imagePositionInformation):
+
+        Set the `elementContainsImageOverlay` bit (see above).
+
 2022-01-14  Per Arne Vollan  <pvol...@apple.com>
 
         Inject Launch Services database before NSApplication is initialized

Modified: trunk/Source/WebKit/Shared/ios/InteractionInformationAtPosition.h (288031 => 288032)


--- trunk/Source/WebKit/Shared/ios/InteractionInformationAtPosition.h	2022-01-14 21:58:40 UTC (rev 288031)
+++ trunk/Source/WebKit/Shared/ios/InteractionInformationAtPosition.h	2022-01-14 22:23:35 UTC (rev 288032)
@@ -81,6 +81,7 @@
 #if ENABLE(DATALIST_ELEMENT)
     bool preventTextInteraction { false };
 #endif
+    bool elementContainsImageOverlay { false };
     bool shouldNotUseIBeamInEditableContent { false };
     bool isImageOverlayText { false };
     bool isVerticalWritingMode { false };

Modified: trunk/Source/WebKit/Shared/ios/InteractionInformationAtPosition.mm (288031 => 288032)


--- trunk/Source/WebKit/Shared/ios/InteractionInformationAtPosition.mm	2022-01-14 21:58:40 UTC (rev 288031)
+++ trunk/Source/WebKit/Shared/ios/InteractionInformationAtPosition.mm	2022-01-14 22:23:35 UTC (rev 288032)
@@ -82,6 +82,7 @@
 #if ENABLE(DATALIST_ELEMENT)
     encoder << preventTextInteraction;
 #endif
+    encoder << elementContainsImageOverlay;
     encoder << shouldNotUseIBeamInEditableContent;
     encoder << isImageOverlayText;
     encoder << isVerticalWritingMode;
@@ -212,6 +213,9 @@
         return false;
 #endif
 
+    if (!decoder.decode(result.elementContainsImageOverlay))
+        return false;
+
     if (!decoder.decode(result.shouldNotUseIBeamInEditableContent))
         return false;
 

Modified: trunk/Source/WebKit/UIProcess/ios/WKContentViewInteraction.mm (288031 => 288032)


--- trunk/Source/WebKit/UIProcess/ios/WKContentViewInteraction.mm	2022-01-14 21:58:40 UTC (rev 288031)
+++ trunk/Source/WebKit/UIProcess/ios/WKContentViewInteraction.mm	2022-01-14 22:23:35 UTC (rev 288032)
@@ -10395,6 +10395,11 @@
         auto requestForTextSelection = [strongSelf createImageAnalyzerRequest:VKAnalysisTypeText image:cgImage.get()];
         auto requestForContextMenu = [strongSelf createImageAnalyzerRequest:VKAnalysisTypeVisualSearch | VKAnalysisTypeMachineReadableCode | VKAnalysisTypeAppClip image:cgImage.get()];
 
+        if (information.elementContainsImageOverlay) {
+            [strongSelf _completeImageAnalysisRequestForContextMenu:requestForContextMenu.get() requestIdentifier:requestIdentifier hasTextResults:YES];
+            return;
+        }
+
         auto textAnalysisStartTime = MonotonicTime::now();
         [[strongSelf imageAnalyzer] processRequest:requestForTextSelection.get() progressHandler:nil completionHandler:[requestIdentifier = WTFMove(requestIdentifier), weakSelf, elementContext, requestLocation, requestForContextMenu, gestureDeferralToken, textAnalysisStartTime] (VKImageAnalysis *result, NSError *error) mutable {
             auto strongSelf = weakSelf.get();
@@ -10422,37 +10427,42 @@
                     return;
                 }
 
-                auto visualSearchAnalysisStartTime = MonotonicTime::now();
-                [[strongSelf imageAnalyzer] processRequest:requestForContextMenu.get() progressHandler:nil completionHandler:[requestIdentifier = WTFMove(requestIdentifier), weakSelf, hasTextResults, visualSearchAnalysisStartTime] (VKImageAnalysis *result, NSError *error) mutable {
-                    auto strongSelf = weakSelf.get();
-                    if (![strongSelf validateImageAnalysisRequestIdentifier:requestIdentifier])
-                        return;
+                [strongSelf _completeImageAnalysisRequestForContextMenu:requestForContextMenu.get() requestIdentifier:requestIdentifier hasTextResults:hasTextResults];
+            });
+        }];
+    } forRequest:request];
+}
 
+- (void)_completeImageAnalysisRequestForContextMenu:(VKImageAnalyzerRequest *)requestForContextMenu requestIdentifier:(WebKit::ImageAnalysisRequestIdentifier)requestIdentifier hasTextResults:(BOOL)hasTextResults
+{
+    auto visualSearchAnalysisStartTime = MonotonicTime::now();
+    [self.imageAnalyzer processRequest:requestForContextMenu progressHandler:nil completionHandler:[requestIdentifier = WTFMove(requestIdentifier), weakSelf = WeakObjCPtr<WKContentView>(self), hasTextResults, visualSearchAnalysisStartTime] (VKImageAnalysis *result, NSError *error) mutable {
+        auto strongSelf = weakSelf.get();
+        if (![strongSelf validateImageAnalysisRequestIdentifier:requestIdentifier])
+            return;
+
 #if USE(QUICK_LOOK)
-                    BOOL hasVisualSearchResults = [result hasResultsForAnalysisTypes:VKAnalysisTypeVisualSearch];
-                    RELEASE_LOG(Images, "Image analysis completed in %.0f ms (request %" PRIu64 "; found visual search results? %d)", (MonotonicTime::now() - visualSearchAnalysisStartTime).milliseconds(), requestIdentifier.toUInt64(), hasVisualSearchResults);
+        BOOL hasVisualSearchResults = [result hasResultsForAnalysisTypes:VKAnalysisTypeVisualSearch];
+        RELEASE_LOG(Images, "Image analysis completed in %.0f ms (request %" PRIu64 "; found visual search results? %d)", (MonotonicTime::now() - visualSearchAnalysisStartTime).milliseconds(), requestIdentifier.toUInt64(), hasVisualSearchResults);
 #else
-                    UNUSED_PARAM(visualSearchAnalysisStartTime);
+        UNUSED_PARAM(visualSearchAnalysisStartTime);
 #endif
-                    if (!result || error) {
-                        [strongSelf _invokeAllActionsToPerformAfterPendingImageAnalysis:WebKit::ProceedWithTextSelectionInImage::No];
-                        return;
-                    }
+        if (!result || error) {
+            [strongSelf _invokeAllActionsToPerformAfterPendingImageAnalysis:WebKit::ProceedWithTextSelectionInImage::No];
+            return;
+        }
 
 #if USE(QUICK_LOOK)
-                    strongSelf->_hasSelectableTextInImage = hasTextResults;
-                    strongSelf->_hasVisualSearchResults = hasVisualSearchResults;
+        strongSelf->_hasSelectableTextInImage = hasTextResults;
+        strongSelf->_hasVisualSearchResults = hasVisualSearchResults;
 #else
-                    UNUSED_PARAM(hasTextResults);
+        UNUSED_PARAM(hasTextResults);
 #endif
 #if USE(UICONTEXTMENU) && ENABLE(IMAGE_ANALYSIS_FOR_MACHINE_READABLE_CODES)
-                    [strongSelf _updateContextMenuForMachineReadableCodeForImageAnalysis:result];
+        [strongSelf _updateContextMenuForMachineReadableCodeForImageAnalysis:result];
 #endif // USE(UICONTEXTMENU) && ENABLE(IMAGE_ANALYSIS_FOR_MACHINE_READABLE_CODES)
-                    [strongSelf _invokeAllActionsToPerformAfterPendingImageAnalysis:WebKit::ProceedWithTextSelectionInImage::No];
-                }];
-            });
-        }];
-    } forRequest:request];
+        [strongSelf _invokeAllActionsToPerformAfterPendingImageAnalysis:WebKit::ProceedWithTextSelectionInImage::No];
+    }];
 }
 
 - (void)imageAnalysisGestureDidFail:(WKImageAnalysisGestureRecognizer *)gestureRecognizer

Modified: trunk/Source/WebKit/WebProcess/WebPage/ios/WebPageIOS.mm (288031 => 288032)


--- trunk/Source/WebKit/WebProcess/WebPage/ios/WebPageIOS.mm	2022-01-14 21:58:40 UTC (rev 288031)
+++ trunk/Source/WebKit/WebProcess/WebPage/ios/WebPageIOS.mm	2022-01-14 22:23:35 UTC (rev 288032)
@@ -2780,6 +2780,8 @@
 
 static void videoPositionInformation(WebPage& page, HTMLVideoElement& element, const InteractionInformationRequest& request, InteractionInformationAtPosition& info)
 {
+    info.elementContainsImageOverlay = ImageOverlay::hasOverlay(element);
+
     if (!element.paused())
         return;
 
@@ -2820,6 +2822,7 @@
     info.isImage = true;
     info.imageURL = element.document().completeURL(renderImage.cachedImage()->url().string());
     info.isAnimatedImage = image.isAnimated();
+    info.elementContainsImageOverlay = is<HTMLElement>(element) && ImageOverlay::hasOverlay(downcast<HTMLElement>(element));
 
     if (request.includeSnapshot || request.includeImageData)
         info.image = createShareableBitmap(renderImage, { screenSize() * page.corePage()->deviceScaleFactor(), AllowAnimatedImages::Yes, UseSnapshotForTransparentImages::Yes });

Modified: trunk/Tools/ChangeLog (288031 => 288032)


--- trunk/Tools/ChangeLog	2022-01-14 21:58:40 UTC (rev 288031)
+++ trunk/Tools/ChangeLog	2022-01-14 22:23:35 UTC (rev 288032)
@@ -1,3 +1,35 @@
+2022-01-14  Wenson Hsieh  <wenson_hs...@apple.com>
+
+        Avoid redundant text analysis requests when long pressing inside an image that contains Live Text
+        https://bugs.webkit.org/show_bug.cgi?id=235129
+        rdar://87366539
+
+        Reviewed by Tim Horton.
+
+        Add a new API test and refactor some existing tests.
+
+        * TestWebKitAPI/Tests/WebKitCocoa/ImageAnalysisTests.mm:
+        (swizzledLocationInView):
+        (-[TestWKWebView simulateImageAnalysisGesture:]):
+
+        Add a helper method to simulate the image analysis gesture recognizer being activated on iOS, and wait for image
+        analysis to finish.
+
+        (TestWebKitAPI::swizzledProcessRequestWithResults):
+        (TestWebKitAPI::TEST):
+
+        Add a new API test to verify that we make at most one extra VKImageAnalyzer request when invoking a non-Live-
+        Text part of the image a second time. Additionally, adjust an existing test (HandleImageAnalyzerErrors) to
+        verify that we invoke VKImageAnalyzer a total of two times during the course of one image analysis gesture (once
+        for Live Text recognition, and another for visual look up). This test currently passes with only a single check
+        because it ends too early (i.e. after only one round trip to the web process and back).
+
+        (TestWebKitAPI::swizzledLocationInView): Deleted.
+        * TestWebKitAPI/cocoa/ImageAnalysisTestingUtilities.mm:
+        (-[TestVKImageAnalysis hasResultsForAnalysisTypes:]):
+
+        Implement this method stub to avoid an unrecognized selector crash.
+
 2022-01-13  Jonathan Bedard  <jbed...@apple.com>
 
         [EWS] Support pull-requests in ValidateChange

Modified: trunk/Tools/TestWebKitAPI/Tests/WebKitCocoa/ImageAnalysisTests.mm (288031 => 288032)


--- trunk/Tools/TestWebKitAPI/Tests/WebKitCocoa/ImageAnalysisTests.mm	2022-01-14 21:58:40 UTC (rev 288031)
+++ trunk/Tools/TestWebKitAPI/Tests/WebKitCocoa/ImageAnalysisTests.mm	2022-01-14 22:23:35 UTC (rev 288032)
@@ -39,8 +39,25 @@
 
 static unsigned gDidProcessRequestCount = 0;
 
+#if PLATFORM(IOS_FAMILY)
+
+static CGPoint gSwizzledLocationInView = CGPointZero;
+static CGPoint swizzledLocationInView(id, SEL, UIView *)
+{
+    return gSwizzledLocationInView;
+}
+
+@interface UIView (ImageAnalysisTesting)
+- (void)imageAnalysisGestureDidBegin:(UIGestureRecognizer *)gestureRecognizer;
+@end
+
+#endif // PLATFORM(IOS_FAMILY)
+
 @interface TestWKWebView (ImageAnalysisTests)
 - (void)waitForImageAnalysisRequests:(unsigned)numberOfRequests;
+#if PLATFORM(IOS_FAMILY)
+- (unsigned)simulateImageAnalysisGesture:(CGPoint)location;
+#endif
 @end
 
 @implementation TestWKWebView (ImageAnalysisTests)
@@ -55,6 +72,22 @@
     EXPECT_EQ(gDidProcessRequestCount, numberOfRequests);
 }
 
+#if PLATFORM(IOS_FAMILY)
+
+- (unsigned)simulateImageAnalysisGesture:(CGPoint)location
+{
+    auto numberOfRequestsAtStart = gDidProcessRequestCount;
+    gSwizzledLocationInView = location;
+    InstanceMethodSwizzler gestureLocationSwizzler { UILongPressGestureRecognizer.class, @selector(locationInView:), reinterpret_cast<IMP>(swizzledLocationInView) };
+    [self.textInputContentView imageAnalysisGestureDidBegin:self._imageAnalysisGestureRecognizer];
+    // The process of image analysis involves at most 2 round trips to the web process.
+    [self waitForNextPresentationUpdate];
+    [self waitForNextPresentationUpdate];
+    return gDidProcessRequestCount - numberOfRequestsAtStart;
+}
+
+#endif // PLATFORM(IOS_FAMILY)
+
 @end
 
 namespace TestWebKitAPI {
@@ -74,18 +107,11 @@
 static void swizzledProcessRequestWithResults(id, SEL, VKImageAnalyzerRequest *, void (^)(double progress), void (^completion)(VKImageAnalysis *, NSError *))
 {
     gDidProcessRequestCount++;
-    auto analysis = createImageAnalysisWithSimpleFixedResults();
-    completion(analysis.get(), nil);
+    completion(createImageAnalysisWithSimpleFixedResults().get(), nil);
 }
 
 #if PLATFORM(IOS_FAMILY)
 
-static CGPoint gSwizzledLocationInView = CGPoint { 100, 100 };
-static CGPoint swizzledLocationInView(id, SEL, UIView *)
-{
-    return gSwizzledLocationInView;
-}
-
 static void swizzledProcessRequestWithError(id, SEL, VKImageAnalyzerRequest *, void (^)(double progress), void (^completion)(VKImageAnalysis *analysis, NSError *error))
 {
     gDidProcessRequestCount++;
@@ -94,46 +120,48 @@
 
 TEST(ImageAnalysisTests, DoNotAnalyzeImagesInEditableContent)
 {
-    InstanceMethodSwizzler gestureLocationSwizzler { UILongPressGestureRecognizer.class, @selector(locationInView:), reinterpret_cast<IMP>(swizzledLocationInView) };
     InstanceMethodSwizzler imageAnalysisRequestSwizzler { PAL::getVKImageAnalyzerClass(), @selector(processRequest:progressHandler:completionHandler:), reinterpret_cast<IMP>(swizzledProcessRequestWithError) };
 
     auto webView = adoptNS([[TestWKWebView alloc] initWithFrame:CGRectMake(0, 0, 400, 400)]);
     [webView _setEditable:YES];
     [webView synchronouslyLoadTestPageNamed:@"image"];
-
-    [webView _imageAnalysisGestureRecognizer].state = UIGestureRecognizerStateBegan;
-    [webView waitForNextPresentationUpdate];
-    EXPECT_EQ(gDidProcessRequestCount, 0U);
+    EXPECT_EQ([webView simulateImageAnalysisGesture:CGPointMake(100, 100)], 0U);
 }
 
-TEST(ImageAnalysisTests, HandleImageAnalyzerError)
+TEST(ImageAnalysisTests, HandleImageAnalyzerErrors)
 {
-    InstanceMethodSwizzler gestureLocationSwizzler { UILongPressGestureRecognizer.class, @selector(locationInView:), reinterpret_cast<IMP>(swizzledLocationInView) };
     InstanceMethodSwizzler imageAnalysisRequestSwizzler { PAL::getVKImageAnalyzerClass(), @selector(processRequest:progressHandler:completionHandler:), reinterpret_cast<IMP>(swizzledProcessRequestWithError) };
 
     auto webView = adoptNS([[TestWKWebView alloc] initWithFrame:CGRectMake(0, 0, 400, 400)]);
     [webView synchronouslyLoadTestPageNamed:@"image"];
 
-    [webView _imageAnalysisGestureRecognizer].state = UIGestureRecognizerStateBegan;
-    [webView waitForNextPresentationUpdate];
-    EXPECT_EQ(gDidProcessRequestCount, 1U);
+    EXPECT_EQ([webView simulateImageAnalysisGesture:CGPointMake(100, 100)], 2U);
 }
 
 TEST(ImageAnalysisTests, DoNotCrashWhenHitTestingOutsideOfWebView)
 {
-    InstanceMethodSwizzler gestureLocationSwizzler { UILongPressGestureRecognizer.class, @selector(locationInView:), reinterpret_cast<IMP>(swizzledLocationInView) };
     InstanceMethodSwizzler imageAnalysisRequestSwizzler { PAL::getVKImageAnalyzerClass(), @selector(processRequest:progressHandler:completionHandler:), reinterpret_cast<IMP>(swizzledProcessRequestWithError) };
 
     auto webView = adoptNS([[TestWKWebView alloc] initWithFrame:CGRectMake(0, 0, 400, 400)]);
     [webView synchronouslyLoadTestPageNamed:@"image"];
 
-    gSwizzledLocationInView = CGPointMake(500, 500);
-    [webView _imageAnalysisGestureRecognizer].state = UIGestureRecognizerStateBegan;
-    [webView waitForNextPresentationUpdate];
+    EXPECT_EQ([webView simulateImageAnalysisGesture:CGPointMake(500, 500)], 0U);
     [webView expectElementCount:1 querySelector:@"img"];
-    EXPECT_EQ(gDidProcessRequestCount, 0U);
 }
 
+TEST(ImageAnalysisTests, AvoidRedundantTextRecognitionRequests)
+{
+    InstanceMethodSwizzler imageAnalysisRequestSwizzler { PAL::getVKImageAnalyzerClass(), @selector(processRequest:progressHandler:completionHandler:), reinterpret_cast<IMP>(swizzledProcessRequestWithResults) };
+
+    auto webView = createWebViewWithTextRecognitionEnhancements();
+    [webView synchronouslyLoadTestPageNamed:@"image"];
+
+    EXPECT_EQ([webView simulateImageAnalysisGesture:CGPointMake(150, 100)], 1U);
+
+    // FIXME: If we cache visual look up results as well in the future, we can bring this down to 0 (that is, no new requests).
+    EXPECT_LT([webView simulateImageAnalysisGesture:CGPointMake(150, 250)], 2U);
+}
+
 #endif // PLATFORM(IOS_FAMILY)
 
 TEST(ImageAnalysisTests, StartImageAnalysisWithoutIdentifier)

Modified: trunk/Tools/TestWebKitAPI/cocoa/ImageAnalysisTestingUtilities.mm (288031 => 288032)


--- trunk/Tools/TestWebKitAPI/cocoa/ImageAnalysisTestingUtilities.mm	2022-01-14 21:58:40 UTC (rev 288031)
+++ trunk/Tools/TestWebKitAPI/cocoa/ImageAnalysisTestingUtilities.mm	2022-01-14 22:23:35 UTC (rev 288032)
@@ -111,6 +111,9 @@
 
 @interface TestVKImageAnalysis : NSObject
 - (instancetype)initWithLines:(NSArray<VKWKLineInfo *> *)lines;
+#if HAVE(VK_IMAGE_ANALYSIS_FOR_MACHINE_READABLE_CODES)
+@property (nonatomic, weak) UIViewController *presentingViewControllerForMrcAction;
+#endif
 @end
 
 @implementation TestVKImageAnalysis {
@@ -131,6 +134,12 @@
     return _lines.get();
 }
 
+- (BOOL)hasResultsForAnalysisTypes:(VKAnalysisTypes)analysisTypes
+{
+    // We only simulate text results for the time being.
+    return analysisTypes == VKAnalysisTypeText && [_lines count];
+}
+
 @end
 
 namespace TestWebKitAPI {