Long upload times and timeouts for GHA codecov/codecov-action@v1

Another example:

  1. All Mongo tests passing:
    https://github.com/apereo/cas/runs/842960465?check_suite_focus=true

  2. Codecov showing 0 test coverage:
    https://codecov.io/gh/apereo/cas/tree/77845983f83126da061cefde2142b1ea8c5714e3/support/cas-server-support-audit-mongo/src/main/java/org/apereo/cas

  3. Similar warnings as above:

2020-07-06T20:42:05.7105162Z ##[group]Run codecov/codecov-action@v1.0.10
2020-07-06T20:42:05.7105405Z with:
2020-07-06T20:42:05.7105590Z   flags: mongo
2020-07-06T20:42:05.7105835Z   fail_***_if_error: true
2020-07-06T20:42:05.7106013Z env:
2020-07-06T20:42:05.7106230Z   JAVA_OPTS: -Xms512m -Xmx6048m -Xss128m -XX:ReservedCodeCacheSize=512m -server -XX:+UseG1GC
2020-07-06T20:42:05.7106496Z   GRADLE_OPTS: -Xms512m -Xmx6048m -Xss128m -XX:ReservedCodeCacheSize=512m -server -XX:+UseG1GC
2020-07-06T20:42:05.7106716Z   TERM: xterm-256color
2020-07-06T20:42:05.7106912Z   SONATYPE_USER: ***
2020-07-06T20:42:05.7107122Z   SONATYPE_PWD: ***
2020-07-06T20:42:05.7107335Z   GH_PAGES_TOKEN: ***
2020-07-06T20:42:05.7107556Z   RENOVATE_TOKEN: ***
2020-07-06T20:42:05.7107743Z   GRADLE_BUILDCACHE_USER: ***
2020-07-06T20:42:05.7107938Z   GRADLE_BUILDCACHE_PSW: ***
2020-07-06T20:42:05.7108148Z   PYTHON_VERSION: 3.8.2
2020-07-06T20:42:05.7108359Z   JDK_CURRENT: 11.0.7
2020-07-06T20:42:05.7108529Z   JDK_LATEST: 14
2020-07-06T20:42:05.7109075Z   JAVA_HOME: /opt/hostedtoolcache/jdk/11.0.7/x64
2020-07-06T20:42:05.7109293Z   JAVA_HOME_11.0.7_x64: /opt/hostedtoolcache/jdk/11.0.7/x64
2020-07-06T20:42:05.7109485Z ##[endgroup]
2020-07-06T20:53:17.3790476Z (node:6613) UnhandledPromiseRejectionWarning: Error: connect ETIMEDOUT 35.199.43.247:443
2020-07-06T20:53:17.3790976Z     at TCPConnectWrap.afterConnect [as oncomplete] (net.js:1129:14)
2020-07-06T20:53:17.3791563Z (node:6613) UnhandledPromiseRejectionWarning: Unhandled promise rejection. This error originated either by throwing inside of an async function without a catch block, or by rejecting a promise which was not handled with .catch(). (rejection id: 1)
2020-07-06T20:53:17.3793323Z (node:6613) [DEP0018] DeprecationWarning: Unhandled promise rejections are deprecated. In the future, promise rejections that are not handled will terminate the Node.js process with a non-zero exit code.
2020-07-06T20:53:23.0976179Z Post job cleanup.
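
For reference, here is a minimal sketch of the workflow step that would produce the inputs logged above. The "flags: mongo" and "fail_ci_if_error: true" inputs come from the log; the step name, report path, and token secret are placeholders, not the repository's actual values.

  # Hedged sketch of the upload step behind the log above; paths and token are assumptions.
  - name: Upload coverage to Codecov
    uses: codecov/codecov-action@v1.0.10
    with:
      token: ${{ secrets.CODECOV_TOKEN }}       # placeholder; may be unnecessary for public repos
      file: ./build/reports/jacoco/report.xml   # placeholder coverage report path
      flags: mongo                              # matches the logged input
      fail_ci_if_error: true                    # matches the logged (masked) input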

Hi @mmoayyed, I updated the action to 1.0.12. Would you be able to see if this is still an issue?

Thanks. I updated the action and kicked off another run:
https://github.com/apereo/cas/actions/runs/178699099

Will post results shortly.

Failed:
https://github.com/apereo/cas/runs/899319279?check_suite_focus=true

Failed:
https://github.com/apereo/cas/runs/899319576?check_suite_focus=true

More failures popping up, for pull requests:
https://github.com/apereo/cas/actions/runs/180257634

Hi @mmoayyed, thanks for the reports. We’re still working on fixes to reduce these failures and will update when we push some changes.

No problem. Looking forward to trying out the new updates!

Hi @mmoayyed, we made some updates to the underlying bash script, please let me know if you still see issues.

Thanks @tom. I have kicked off another build and am watching the results closely. Will post an update shortly.

Results from the most recent build; two more timeouts:

https://github.com/mmoayyed/cas/runs/930558253?check_suite_focus=true

and

https://github.com/mmoayyed/cas/runs/930558290?check_suite_focus=true

I restarted the same job a few more times, and at times it does pass entirely, but there are still phantom failures here and there.

That said, there is something strange about how results are processed. For example, the commit tied to this job finished yesterday, and yet Codecov shows the files are still in the queue to be processed.

Is this accurate?

I’m still seeing issues on 1.0.12.

Just now:

Hi @adamtheturtle, I just released v1.0.13. Can you let me know how that is?

I have a repository with >40 jobs in the main CI workflow (VWS-Python/vws-python-mock, a mock of the Vuforia Web Services (VWS) API).
In this repository, nearly every run hit this error on at least one job.
In my branch bumping this action to v1.0.13, the run passed (pull request #690, "Bump codecov and make CI fail if it fails", in VWS-Python/vws-python-mock); see the sketch at the end of this post for the kind of per-job upload step involved.
However, the code coverage is reported as 99.25%, even hours later, and the “missing” coverage is code which I am certain is covered.

For example, https://codecov.io/gh/VWS-Python/vws-python-mock/compare/355730c6297fc61e4cede0f9cdf5dc2e66918e06...143c1ffd35af34b7c14fc3817ddf1e25f31b82f3/src/tests/mock_vws/test_query.py.
This shows that test_query::TestAcceptHeader::test_valid is not covered at all.
However, I can see in that same pull request (#690) that the tests ran and passed.
It is as if the coverage report from https://pipelines.actions.githubusercontent.com/24aMLzmFiVrnOAZbDJVsjvzPJ0pTXAUSmss8BIAcQXv2PpCi9Y/_apis/pipelines/1/runs/518/signedlogcontent/18?urlExpires=2020-08-18T12%3A37%3A30.7865207Z&urlSigningMethod=HMACV1&urlSignature=Fu%2FFZ9GTOitbaxKxsPzOvXloNsadurjGE9Qoi68eLHI%3D is not considered.

I can create a different issue for this if appropriate - I would appreciate some advice on this.
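
For context, a rough sketch of the kind of per-job upload step involved when many jobs each upload their own coverage report; the matrix variable, report path, and token secret below are assumptions, not this repository's actual configuration.

  # Hedged sketch only: each of the ~40 jobs uploads its own report and Codecov
  # is expected to merge them; every value below is a placeholder.
  - name: Upload coverage
    uses: codecov/codecov-action@v1.0.13
    with:
      token: ${{ secrets.CODECOV_TOKEN }}   # placeholder token secret
      file: ./coverage.xml                  # placeholder report path
      name: ${{ matrix.ci_pattern }}        # placeholder per-job upload name
      fail_ci_if_error: true                # fail the job if the upload fails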

Hi @adamtheturtle, if you don’t mind, would you be able to open a new issue for this?

@tom Sure - I created a new issue: "Coverage not accurately reported".
