
[RFC] Using Detox for Interaction Tests of React Native Core #589

Closed
elicwhite opened this issue Feb 28, 2018 · 21 comments

@elicwhite

elicwhite commented Feb 28, 2018

The problem

The ScrollView component in RN for iOS is over 1,000 lines and riddled with comments about how certain lines fix random visual bugs.

  • We have no tests for these edge cases, and thus no confidence that open source PRs don't break them.
  • We have no confidence that new versions of the operating system don't cause regressions.
  • When trying to get feature parity on the other platform, we have no idea if we sufficiently match the implementation.

This is typical of pretty much every component in React Native core, which drastically slows us down when we want to confidently make improvements or consider merging PRs from the community.

The reason these things aren't tested is that they are visual behaviors that require interaction, often gestures, and none of our existing testing tools give us that functionality.

Examples

Here are some examples of things that are really hard to test with our existing tools:

  • The behavior of overScrollMode on ScrollView. If auto, overscroll is allowed when the content is large enough to meaningfully scroll; always = always; never = never.
    • I'd imagine two tests for auto: one with a small child, one with a large child. The test would render the ScrollView, take a screenshot, do a single-finger drag, and before letting go take another screenshot to capture how far we were able to overscroll (this could be affected by the amount of tension in the overscroll spring). Then let go and, once it settles (timeout), take another screenshot to capture where it settled.
  • minimumZoomScale / maximumZoomScale on ScrollView
    • Very similar to above, but it requires a two-finger gesture. Can we zoom past? Does it snap back to the max / min?
  • Press state on TouchableOpacity
    • Instead of mimicking a press by passing through a test-only prop that renders the pressed state, we'd want to be able to take a screenshot of the pressed state by going through the normal interaction flow a user would have.

Solution

We think Detox hits a place where it can really help us solve this. It seems like Detox already gives us access to almost everything we need but is missing the pieces that would really unlock this kind of testing.

  1. We need support for taking screenshots. It looks like there is an open issue for this that has some workarounds.
  2. We need to be able to do point to point drags. Looks like there is an open issue for that as well. Something that would be important for us is the ability to take a screenshot before releasing the touch.
  3. We need to be able to do gestures like pinch to zoom. I don't see an open issue for that. Like 2 above, we need to be able to take a screenshot before releasing the gesture.

RFC

I'd love to hear whether you believe detox is the right tool to solve this problem or if you have other recommendations we should investigate. Are the things we would need to accomplish this reasonable? Are there other major pitfalls we should be considering?

@rotemmiz (Member)

Hey there,
I do believe Detox is the tool for this kind of test; I can't see it being solved with unit tests.

  1. There's a comprehensive artifact recording solution in [this PR] ([WIP] Screenshots and screen recordings of tests #541); we still have performance issues with it, which is why it hasn't been merged yet. Regarding taking screenshots during actions, I'm not sure how an API of this kind should look: how can we stop the action halfway? How do we synchronize the action with the screenshot to make sure it is taken exactly when we need it?
  2. Seems like this is a solved issue in Espresso and EarlGrey?
  3. Seems like this is a solved issue in Espresso and EarlGrey. We need to create/use these actions and export their API to Detox JS.

PRs for 2 and 3 are more than welcome!! I believe we can do it, but can't commit to a date. If someone wants to take it, we will be happy to mentor.

I know @LeoNatan has a few takes about this as well, please share them kind sir.

@LeoNatan (Contributor)

@rotemmiz Regarding your 1., it is much easier to design an API on the native side, since there every action is synchronous and we can put the screenshot-taking step anywhere in the touch process (before, on touch down, on touch up, after idle, etc.).
Perhaps a purposeful API can be given for certain actions (such as tap, scrollTo, etc.) where we provide a {snapshot: <when>} parameter and somehow implement it in native. It's not a general-purpose API, but it may be good enough.

Regarding 2 and 3, they are solvable and PRs are welcome, but I'd like to first have a clear idea of what the API would be. I am not a fan of absolute values: tests written with tapAtPoint(x, y) or dragFromTo(x1, y1, x2, y2) are flaky and depend on the underlying layout system (be it native UIKit/Android, RN, or both). This has come up a lot, so there is clearly demand for such an API. I think, at the very minimum, there should be two modes of working: absolute and percent-based. It would be great if I could express "start from the middle and pan 20% up" or "with two fingers in the middle, pinch out 35%". This kind of API for test writers should be much more resilient to layout changes, OS style changes, and device size changes.

I don't think even EarlGrey has such an API, but I don't mind that; we can do the heavy lifting of the math calculations ourselves in native. We just need an expressive enough public API in JS.

Discussion and proposals are welcome.
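As a rough illustration of the percent-based idea, here is a minimal sketch of the coordinate math in plain JavaScript. All names here (panPoints, the frame shape, the spec fields) are hypothetical, not an existing Detox API; the point is only that percent-based specs reduce to absolute points once an element's frame is known:

```javascript
// Convert a percent-based pan spec into absolute start/end points,
// given an element's frame { x, y, width, height }.
// All names here are hypothetical, for discussion only.
function panPoints(frame, { from = { x: 50, y: 50 }, direction, distancePercent }) {
  // Starting point, expressed as percentages of the element's size.
  const start = {
    x: frame.x + (frame.width * from.x) / 100,
    y: frame.y + (frame.height * from.y) / 100,
  };
  // Pan distance, expressed as a percentage of the relevant dimension.
  const delta = {
    up: { dx: 0, dy: -(frame.height * distancePercent) / 100 },
    down: { dx: 0, dy: (frame.height * distancePercent) / 100 },
    left: { dx: -(frame.width * distancePercent) / 100, dy: 0 },
    right: { dx: (frame.width * distancePercent) / 100, dy: 0 },
  }[direction];
  return { start, end: { x: start.x + delta.dx, y: start.y + delta.dy } };
}

// "start from the middle and pan 20% up" on a 100x200 element at (0, 0):
const { start, end } = panPoints(
  { x: 0, y: 0, width: 100, height: 200 },
  { direction: 'up', distancePercent: 20 }
);
// start = { x: 50, y: 100 }, end = { x: 50, y: 60 }
```

Doing this math behind a percent-based public API would keep tests independent of device size, as suggested above.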

@elicwhite (Author)

elicwhite commented Feb 28, 2018

It seems like the most challenging thing here is figuring out what an API looks like for taking screenshots during an action.

@LeoNatan I'm not so sure that we need something generic enough to take a screenshot anywhere in the touch process. Tests that take a screenshot at the beginning or middle of an animation are almost always flaky. If the screenshot happens 1ms later than the previous run, the screenshots might not match. I think it is important to focus on being able to take screenshots when things have settled. That may even require tying into Animated to know when nothing left is running.

If our intention is to be able to take screenshots at the settled press state of a button, we would need to use something fundamentally different than a tap api which would likely execute too fast.

I'm a bit worried about bikeshedding around APIs for this since we know what the requirements are but it seems like there are a few different approaches that could have tradeoffs we don't fully understand yet. It seems like it would be great to have detox provide the basic primitives and let us experiment with what those APIs look like, perhaps doing some more complicated math in JS. I could easily imagine that becoming a helper library on top of Detox. This approach might even let us solve the problem of waiting for Animated to finish if a JS library can call the next step of the animation in an Animated completion callback.

@LeoNatan (Contributor)

We already know when animations are over. If all that is needed is screenshot on idle, we shall have that soon from the PR.

Everything is done in native. Detox's JS does not run on the device, only in the tester process; Detox syncs and performs matching and actions in native. Thus the JS API has to be thought out first before implementing the native side.

And it's always the API that is hardest, haha.

@LeoNatan (Contributor)

LeoNatan commented Feb 28, 2018

It's not bikeshedding, it's very important for our users 😉
The API has to be resilient; otherwise we'll see an influx of questions about these APIs.

@elicwhite (Author)

:)

What I mean by making the primitives available is being able to write tests like this:

describe('touchableHighlight', () => {
  it('should have the correct pressed state', async () => {
    const button = await element(by.id('testButton'));
    const touch = await button.touchDown();
    // take a snapshot
    await touch.touchUp();
  });

  it('should not call press if touch moved outside component', async () => {
    const button = await element(by.id('testButton'));
    const position = await button.getPositionAndBounds();
    const componentCenter = position.getCenter();
    const touch = await button.touchDown(componentCenter);
    const belowComponentLocation = position.addY(position.getHeight());
    await touch.moveTo(belowComponentLocation);
    // take a snapshot to see what the behavior is when no longer on top of the button
    await touch.touchUp();
    await expect(element(by.id('answer'))).toHaveText('not pressed');
  });
});

If there are cleaner APIs for doing specific things like drag from one component to another, they could be made in userland using these primitives.
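For instance, a userland drag helper could be composed from such primitives. In this sketch, touchDown / moveTo / touchUp and getPositionAndBounds are the hypothetical primitive names from this discussion, not an existing Detox API:

```javascript
// A userland helper composed from hypothetical touch primitives
// (touchDown / moveTo / touchUp / getPositionAndBounds are assumed
// names from this discussion, not an existing Detox API).
// Taking the element as a parameter keeps the helper testable in isolation.
async function dragFromCenterBy(el, dx, dy) {
  const position = await el.getPositionAndBounds();
  const center = position.getCenter();
  const touch = await el.touchDown(center);
  await touch.moveTo({ x: center.x + dx, y: center.y + dy });
  await touch.touchUp();
}
```

Because the helper only depends on the primitive interface, it could live in a library on top of Detox rather than inside it.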

@elicwhite (Author)

Do you agree with this type of API, and should we move forward trying to figure out the specifics?

Or do you think it should be something higher level like:

element(by.id('button')).swipeDown({
  distance: '10%',
  screenshot: {
    beforeSwipe: true,
    afterSettleBeforeRelease: true
  }
});

@LeoNatan (Contributor)

LeoNatan commented Mar 6, 2018

I think the latter API makes a lot more sense for Detox. We don't have element positions in the JS part of Detox, so it makes little sense to deal with coordinates there. It seems like a good direction.

@RossKinsella

element(by.id('button')).swipeDown looks like a good solution for my current issue: https://stackoverflow.com/questions/49330906/detox-flatlist-not-scrolling


stale bot commented May 1, 2018

This issue has been automatically marked as stale because it has not had recent activity. It will be closed if no further activity occurs.
If you believe the issue is still relevant, please test on the latest Detox and report back.
Thank you for your contributions.

@elicwhite (Author)

Stale bot is reminding me that I'm sad that we dropped this.

I feel like I gave up on this because I don't see a path forward with a high-level API that gives us the flexibility we need to test the functionality React Native wants to test, and it sounds like Detox doesn't want to support a low-level API for this.

Is that consistent with the way your team is thinking about this? If not, and this is something you want to support but only from a high level API, perhaps you could provide some potential examples of an API that would support both of the following use cases:

Press state on TouchableOpacity

Press down on a button and take a screenshot of the element once the press animation completes.

Testing minimumZoomScale / maximumZoomScale on scrollView

  • Using a two finger pinch zoom, zoom out beyond the minimumZoomScale, take a screenshot to verify that the background of the scroll view continues into the overscroll area.
  • Release the pinch and take a screenshot when settled to verify that the scroll view ended at the minimumZoomScale. Do the same for maximumZoomScale

@bootstraponline
I think exposing the low level API primitives makes sense. Appium took the approach of defining high level touch actions and that wasn't very flexible.

@elicwhite (Author)

@bootstraponline, I took a closer look at the link you provided to Appium's docs. It seems like everything we want to test would be possible with that API. Can you provide some more insight as to how Appium's API isn't flexible enough? What use cases does it not handle?

@bootstraponline
bootstraponline commented Jun 18, 2018

If the API is flexible enough, that's great.

Appium's API doesn't handle the use case of using the scrolling methods from the underlying drivers (EarlGrey/Espresso) directly. If the existing API doesn't do what you want, then you're stuck. The Detox approach is based on creating bindings to the native methods, which avoids a lowest-common-denominator situation where platform-specific scrolling methods aren't available.

https://github.com/wix/detox/blob/master/docs/More.DesignPrinciples.md explains the design differences between detox & WebDriver based solutions such as appium.

@rotemmiz (Member)

rotemmiz commented Jul 8, 2018

@bootstraponline Seems like you have something in mind. How would you define that API?

@bootstraponline
I'd take the existing scrolling methods for EarlGrey/Espresso and make them callable via JavaScript. These are listed under gestures in the cheatsheet.

Espresso

  • scrollTo()
  • swipeLeft()
  • swipeRight()
  • swipeUp()
  • swipeDown()

EarlGrey

  • grey_scrollInDirection()
  • grey_scrollInDirectionWithStartPoint()
  • grey_scrollToContentEdge()
  • grey_scrollToContentEdgeWithStartPoint()
  • etc.

By exposing the underlying primitives, users can compose them into higher-level APIs as needed.
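A sketch of that kind of composition, assuming hypothetical JS bindings named swipeUp() and isVisible() (neither is a confirmed Detox API here):

```javascript
// Compose a hypothetical low-level swipe binding into a higher-level
// "scroll until visible" helper. swipeUp() and isVisible() are assumed
// names for illustration, not a real Detox API.
async function scrollUntilVisible(scrollEl, targetEl, maxSwipes = 5) {
  for (let i = 0; i < maxSwipes; i++) {
    if (await targetEl.isVisible()) return true;
    await scrollEl.swipeUp();
  }
  // One last check after the final swipe.
  return targetEl.isVisible();
}
```

The framework only needs to ship the primitive; the retry policy, swipe count, and direction all stay in userland.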

@elicwhite (Author)

@bootstraponline, thanks for taking a stab at a proposal.

Since this issue is specifically about how to use detox to test the core React Native components, can you go through the specific examples I listed and describe how tests could be written for those using these new APIs?

@bootstraponline
I think it'd look something like this.

Press state on TouchableOpacity

Press down on a button and take a screenshot of the element once the press animation completes.

await element(by.text('button')).pressDown();
await device.screenshot();

Testing minimumZoomScale / maximumZoomScale on scrollView

Using a two finger pinch zoom, zoom out beyond the minimumZoomScale, take a screenshot to verify that the background of the scroll view continues into the overscroll area.
Release the pinch and take a screenshot when settled to verify that the scroll view ended at the minimumZoomScale. Do the same for maximumZoomScale

await element(by.id('scrollView')).pinchOut();
await device.screenshot();
await element(by.id('scrollView')).pinchIn();
await device.screenshot();

In going through the specific examples, I found that these methods don't exist in Espresso/EarlGrey. If we have to build a custom API for Detox anyway, then maybe following along with the Multi-Action API from Mozilla isn't a bad idea. Similar methods do exist in Appium/Selendroid/WebDriverAgent/Xamarin.UITest. It'd probably be worth reviewing all the existing APIs and then figuring out what makes sense for Detox.

Appium works with react native, maybe building out a PoC using the existing API there would be a useful next step.

@elicwhite (Author)

We are working on the implementation of setNativeProps for Fabric and would love to be able to write some tests using detox, but I believe we don't have the APIs necessary to write these tests. Specifically, we need to access properties (like background color) on the native underlying view (like UIView).

setNativeProps has some semantics that are simple to implement in today's world but require more internal tracking to keep the same semantics in Fabric. This complexity comes from the number of layers that props flow through, where they get diffed and handled in steps: React Native product code, React core, the shadow tree, and the native views. We want to make sure these edge cases are handled in specific ways, so we'd love to write tests for this.

This is an example of a test I want to be able to write:

  1. Start the background and border of the view as blue
  2. Change the background to green with setNativeProps
  3. Change the border to green with setNativeProps
  4. Change the background to purple with state
  5. The border should still be green

Here is a low-quality gif of such a test: (gif: setNativeProps demo)

The main thing I don't think we have in detox is access to the native view properties to be able to write expectations on properties like border color or background color. I believe that @rotemmiz and I talked about this in the past but wanted to bring it back up as we have a very specific need for this now.

@rotemmiz (Member)

@TheSavior Can you point me to a code snippet of those properties you want to check (preferably in this specific use case)?

@LeoNatan (Contributor)

Unfortunately, I think this is out of scope, at least for now. We just don't have the resources to even start considering a solution that would be acceptable here, let alone implement one. I think it's better to keep Detox focused on what it does best.

lock bot locked as resolved and limited conversation to collaborators on Jun 30, 2019