[RFC] Using Detox for Interaction Tests of React Native Core #589
Comments
Hey there,
PRs for 2 and 3 are more than welcome! I believe we can do it, but can't commit to a date. If someone wants to take it, we will be happy to mentor. I know @LeoNatan has a few takes on this as well; please share them, kind sir.
@rotemmiz Regarding your 1., it is much easier to design an API on the native side, as there every action is synchronous and we can put the screenshot-taking step anywhere in the touch process (before, on touch down, on touch up, after idle, etc.). Regarding 2 and 3, they are solvable and PRs are welcome, but I'd like to first have a clear idea of what the API would be. I am not a fan of absolute values. I don't think even Earl Grey has such an API, but I don't mind that; we can do the heavy lifting of math calculations on our own in native. We just need an expressive enough public API in JS. Discussion and proposals are welcome.
It seems like the most challenging thing here is figuring out what an API looks like for taking screenshots during an action. @LeoNatan I'm not so sure that we need something generic enough to take a screenshot anywhere in the touch process. Tests that take a screenshot at the beginning or middle of an animation are almost always flaky: if the screenshot happens 1 ms later than in the previous run, the screenshots might not match. I think it is important to focus on being able to take screenshots when things have settled. That may even require tying into Animated to know when nothing is left running. If our intention is to be able to take screenshots at the settled press state of a button, we would need something fundamentally different.

I'm a bit worried about bikeshedding around APIs for this, since we know what the requirements are but there seem to be a few different approaches with tradeoffs we don't fully understand yet. It would be great to have Detox provide the basic primitives and let us experiment with what those APIs look like, perhaps doing some of the more complicated math in JS. I could easily imagine that becoming a helper library on top of Detox. This approach might even let us solve the problem of waiting for Animated to finish, if a JS library can call the next step of the animation in an Animated completion callback.
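The completion-callback idea can be made concrete with a small sketch. `Animated.timing(value, config).start(callback)` mirrors the shape of the real React Native API, but everything else here (the synchronous stubbed `Animated`, the `takeScreenshot` helper) is a stand-in invented so the idea runs standalone: screenshots fire only from the completion callback, i.e. only once the animation has settled.

```javascript
// Stand-in for React Native's Animated, synchronous purely for illustration.
// The real Animated.timing(value, config).start(cb) calls cb({ finished }) when done.
const Animated = {
  timing: (value, config) => ({
    start: (callback) => {
      value.current = config.toValue;   // jump straight to the settled end state
      callback({ finished: true });     // completion callback: safe to snapshot now
    },
  }),
};

const screenshots = [];
const takeScreenshot = (name) => screenshots.push(name); // hypothetical helper

const opacity = { current: 1 };

// Screenshot only from the completion callback, never mid-animation:
Animated.timing(opacity, { toValue: 0.2, duration: 150 }).start(({ finished }) => {
  if (finished) takeScreenshot('pressed-state');
});

console.log(screenshots); // prints [ 'pressed-state' ]
```

The point of the design: because the screenshot call lives inside the completion callback, a test can never observe a half-finished animation, which removes the 1 ms timing flakiness described above.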
We already know when animations are over. If all that is needed is a screenshot on idle, we shall have that soon from the PR. It is all done in native. The JS part of Detox does not run on the device, only on the tester; Detox syncs and performs matching and actions in native. Thus the JS API has to be thought out first, before implementing the native side. And it's always the API that is hardest, haha.
It's not bikeshedding, it's very important for our users 😉
:) What I mean by making the primitives available is being able to write tests like this:
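The test snippet that originally followed this comment did not survive the scrape. As a sketch of the intent, here is a hypothetical low-level gesture API (`gesture()`, `pressAt`, `moveTo`, `release`, and `perform` are all invented names, not real Detox methods), with a tiny stub standing in for the device so the shape is runnable:

```javascript
// Stub device that records each primitive call; a real implementation would
// synthesize touch events natively. All method names here are hypothetical.
const calls = [];
const device = {
  gesture() {
    const g = {
      pressAt: (x, y) => { calls.push(`pressAt(${x},${y})`); return g; },
      moveTo: (x, y) => { calls.push(`moveTo(${x},${y})`); return g; },
      release: () => { calls.push('release()'); return g; },
      perform: async () => { calls.push('perform()'); }, // run once the app is idle
    };
    return g;
  },
  takeScreenshot: async (name) => { calls.push(`screenshot(${name})`); },
};

// A drag test written directly against the primitives:
async function dragAndSnapshot() {
  await device.gesture()
    .pressAt(40, 300)   // touch down on the source view
    .moveTo(40, 80)     // drag upward
    .release()          // touch up
    .perform();
  await device.takeScreenshot('after-drag'); // snapshot the settled state
}

dragAndSnapshot().then(() => console.log(calls.join(' -> ')));
```

The primitives are deliberately coordinate-based; the follow-up comment below argues that cleaner, element-to-element actions could then be built in userland on top of them.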
If there are cleaner APIs for doing specific things, like dragging from one component to another, they could be built in userland using these primitives.
Do you agree with this type of API, and should we move forward trying to figure out the specifics? Or do you think it should be something higher level, like:
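The higher-level example that followed this sentence was lost in the scrape. As a sketch of that style, here are semantic, coordinate-free actions: `swipe` resembles an action Detox ships today, while `dragTo` is purely hypothetical, and the `element`/`by` objects below are minimal stand-ins for the Detox globals so the shape is concrete:

```javascript
// Minimal stand-ins for Detox's element/by globals (not the real library).
const performed = [];
const by = { id: (id) => `#${id}` };
const element = (matcher) => ({
  matcher,
  longPress: async () => { performed.push(`longPress ${matcher}`); },
  dragTo: async (target) => { performed.push(`drag ${matcher} -> ${target.matcher}`); },
  swipe: async (direction, speed) => { performed.push(`swipe ${matcher} ${direction} ${speed}`); },
});

// High-level test steps: no coordinates anywhere, only element matchers.
(async () => {
  await element(by.id('draggableBox')).longPress();
  await element(by.id('draggableBox')).dragTo(element(by.id('dropTarget')));
  await element(by.id('scrollView')).swipe('up', 'fast');
  console.log(performed.join('\n'));
})();
```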
I think the latter API makes a lot more sense for Detox. We don't have element positions in the JS part of Detox, so it makes little sense to deal with coordinates there. The latter API seems like a good direction.
This issue has been automatically marked as stale because it has not had recent activity. It will be closed if no further activity occurs.
Stale bot is reminding me that I'm sad we dropped this. I feel like I gave up on this because I don't see a path forward with a high-level API that gives us the flexibility we need to test the functionality React Native is looking to test, and it sounds like Detox doesn't want to support a low-level API to do this. Is that consistent with the way your team is thinking about this? If not, and this is something you want to support but only from a high-level API, perhaps you could provide some potential examples of an API that would support both of the following use cases:
Press state on TouchableOpacity: press down on a button and take a screenshot of the element once the press animation completes.
Testing minimumZoomScale / maximumZoomScale on ScrollView.
I think exposing the low-level API primitives makes sense. Appium took the approach of defining high-level touch actions, and that wasn't very flexible.
@bootstraponline, I took a closer look at the link you provided to Appium's docs. It seems like everything we want to test would be possible with that API. Can you provide some more insight as to how Appium's API isn't flexible enough? What use cases does it not handle?
If the API is flexible enough, that's great. Appium's API doesn't handle the use case of using the scrolling methods from the underlying drivers (EarlGrey/Espresso) directly; if the existing API doesn't do what you want, then you're stuck. The Detox approach is based on creating bindings to the native methods, which avoids a lowest-common-denominator situation where platform-specific scrolling methods aren't available. https://github.com/wix/detox/blob/master/docs/More.DesignPrinciples.md explains the design differences between Detox and WebDriver-based solutions such as Appium.
@bootstraponline it seems like you have something in mind; how would you define that API?
I'd take the existing scrolling methods for EarlGrey/Espresso and make them callable via JavaScript. These are listed under gestures in the cheatsheets for Espresso and EarlGrey. By exposing the underlying primitives, users can compose them into higher-level APIs as needed.
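To illustrate the composition point: supposing Detox exposed a raw native scroll binding directly (the `nativeScroll` and `isVisible` names below are hypothetical, as are the stubs that make the sketch runnable), a userland helper could build "scroll until visible" on top of it:

```javascript
// Userland helper composed from hypothetical low-level bindings.
async function scrollUntilVisible(scrollView, target) {
  // Keep issuing small native scrolls until the target reports it is visible.
  while (!(await target.isVisible())) {
    await scrollView.nativeScroll('down', 200); // hypothetical raw binding
  }
}

// Stubs so the sketch runs standalone: the target becomes visible
// after three visibility checks, i.e. after three scrolls.
const scrolls = [];
const scrollView = { nativeScroll: async (dir, px) => scrolls.push(`${dir} ${px}px`) };
let checks = 0;
const target = { isVisible: async () => ++checks > 3 };

scrollUntilVisible(scrollView, target).then(() => console.log(`${scrolls.length} scrolls`));
```

This is the shape of the argument in the comment above: Detox binds the native methods, and anything higher level (like this loop) stays in userland JS.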
@bootstraponline, thanks for taking a stab at a proposal. Since this issue is specifically about how to use Detox to test the core React Native components, can you go through the specific examples I listed and describe how tests could be written for those using these new APIs?
I think it'd look something like this.

Press state on TouchableOpacity:

```js
await element(by.text('button')).pressDown();
await device.screenshot();
```

Testing minimumZoomScale / maximumZoomScale on ScrollView:

```js
await element(by.id('scrollView')).pinchOut();
await device.screenshot();
await element(by.id('scrollView')).pinchIn();
await device.screenshot();
```

In going through the specific examples, these methods don't exist in Espresso/EarlGrey. If we have to build a custom API for Detox anyway, then maybe following along with the Multi-Action API from Mozilla isn't a bad idea. Similar methods do exist in Appium/Selendroid/WebDriverAgent/Xamarin.UITest. It'd probably be worth reviewing all the existing APIs and then figuring out what makes sense for Detox. Appium works with React Native; maybe building out a PoC using the existing API there would be a useful next step.
We are working on the implementation of setNativeProps for Fabric and would love to be able to write some tests using Detox, but I believe we don't have the APIs necessary to write these tests. Specifically, we need to access properties (like background color) on the underlying native view (like UIView). setNativeProps has semantics that are simple to implement in today's world but require more internal tracking to keep the same semantics in Fabric. This complexity comes from the number of layers that props flow through, getting diffed and handled in steps: React Native product code, React core, the shadow tree, and the native views. We want to make sure these edge cases are handled in specific ways, so we'd love to write tests for this. This is an example of a test I want to be able to write:
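The concrete test example attached to this comment was lost in the scrape. As a hedged sketch of what such a test could look like: the `getNativeProp` accessor below is hypothetical (Detox exposes no such API), and the `nativeView`, `setNativeProps`, and `rerender` stubs are stand-ins for the app under test so the sketch runs on its own:

```javascript
// Stub "app": a native view plus the operations the real test would exercise.
const nativeView = { backgroundColor: 'blue' };
const setNativeProps = (view, props) => Object.assign(view, props);
const rerender = (view) => { /* Fabric must preserve setNativeProps values here */ };

// Hypothetical test accessor for reading properties off the native view.
const getNativeProp = (view, key) => view[key];

const assertEqual = (actual, expected) => {
  if (actual !== expected) throw new Error(`expected ${expected}, got ${actual}`);
};

// The test: a value set via setNativeProps must survive a React re-render.
setNativeProps(nativeView, { backgroundColor: 'red' });
assertEqual(getNativeProp(nativeView, 'backgroundColor'), 'red');
rerender(nativeView);
assertEqual(getNativeProp(nativeView, 'backgroundColor'), 'red');
console.log('setNativeProps value survived re-render');
```

The missing piece in Detox is exactly the `getNativeProp` step: an expectation that reads a property (background color, border color) off the underlying native view.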
A low-quality GIF of such a test was attached here. The main thing I don't think we have in Detox is access to the native view properties, to be able to write expectations on properties like border color or background color. I believe @rotemmiz and I talked about this in the past, but I wanted to bring it back up as we have a very specific need for this now.
@TheSavior Can you point me to a code snippet of those properties you want to check (preferably in this specific use case)?
Unfortunately, I think this is out of scope, at least for now. We just don't have the resources to even start considering a solution that would be acceptable here, let alone implement one. I think it's better to keep Detox focused on what it does best.
The problem
The ScrollView component in RN for iOS is over 1,000 lines and riddled with comments about how certain lines fix random visual bugs.
This is true of pretty much every component in React Native core, and it drastically slows down our ability to confidently make improvements or consider merging PRs from the community.
The reason these things aren't tested is that they are visual behaviors that require interaction, often gestures, and none of our existing testing tools provide that functionality.
Examples
Here are some examples of things that are really hard to test with our existing tools:
Solution
We think Detox hits a place where it can really help us solve this. It seems like Detox already gives us access to almost everything we need, but it is missing the pieces that really unlock this kind of testing.
RFC
I'd love to hear whether you believe Detox is the right tool to solve this problem, or whether you have other recommendations we should investigate. Are the things we would need to accomplish this reasonable? Are there other major pitfalls we should be considering?