If you open the commits page of our Codecov project, you’ll see that some commits (4-6 days old) are stuck in the “Processing…” state.
The list of commits is not up to date; it is missing the latest GitHub commits.
Clicking the “Processing…” button, for example this link, makes Codecov report “Unable to find report content in the storage archive.”, even though those commits exist on GitHub.
We’re seeing a lot of commits stuck in “Processing…” in this repo too; you can see them on the commits page. This started around the time of the outage and has continued for several days with no sign of stopping. Is it for the same reason?
Hi @Moandor, this shouldn’t be related to that incident. I’m not quite sure why they are stuck in Processing, so I’ll bump this up to the product team to get some resolution here. Thanks for bringing this up.
Hi @Moandor, it looks like this is an issue on our side. It’s going to take some time to investigate and get a fix in. The main issue is that the repository is hitting a limit on the number of uploads, so in the meantime, if possible, I would suggest combining several of the coverage reports into a single Codecov upload step.
@mmoayyed, this has a different underlying cause. If this is a blocker, I would suggest uploading reports in the same Codecov step if possible, or decreasing the number of builds. I apologize for the inconvenience while we investigate a solution.
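For illustration, a single Codecov step can be pointed at several report files at once. A minimal sketch of what that could look like in a GitHub Actions workflow (the file paths here are placeholders, not your actual configuration):

```yaml
# Hypothetical sketch: one Codecov upload step covering several reports.
- name: Upload all coverage reports in one step
  uses: codecov/codecov-action@v2
  with:
    # `files` accepts a comma-separated list of report paths,
    # so one invocation replaces several per-report upload steps.
    files: ./reports/jmx/jacocoRootReport.xml,./reports/spnego/jacocoRootReport.xml
```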
Thanks very much for the update. Please post back when you do find a solution.
I don’t quite follow your note on “uploading reports in the same Codecov step”. There are jobs that each test a particular area of the system via GitHub Actions, and each job posts its own coverage results. How can I combine the step that runs the tests with the step that uploads the coverage results, as shown below?
@mmoayyed, I have seen other users upload the test results using actions/upload-artifact and then add a job at the end of the CI flow, depending on all previous test runs, that uploads coverage to Codecov. You can see an example here, under continuous-integration.yml
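Roughly, that pattern looks like the sketch below. The matrix categories, test command, and report paths are assumptions for illustration; the key ideas are the `needs: tests` dependency and the single Codecov upload at the end:

```yaml
# Hypothetical sketch: per-category test jobs store their JaCoCo reports as
# artifacts, and one final job downloads them all and uploads to Codecov once.
jobs:
  tests:
    runs-on: ubuntu-latest
    strategy:
      matrix:
        category: [jmx, spnego]   # assumed category names
    steps:
      - uses: actions/checkout@v2
      - name: Run tests for ${{ matrix.category }}
        run: ./gradlew test -Pcategory=${{ matrix.category }}   # assumed command
      - uses: actions/upload-artifact@v2
        with:
          name: cas-tests-${{ matrix.category }}.coverage
          path: build/reports/jacoco/jacocoRootReport.xml       # assumed path
  coverage:
    needs: tests                  # runs only after all matrix jobs finish
    runs-on: ubuntu-latest
    steps:
      - uses: actions/download-artifact@v2
        with:
          path: reports           # each artifact lands in its own subdirectory
      - uses: codecov/codecov-action@v2
        with:
          directory: reports      # one upload covering every downloaded report
```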
Thank you for the reference and example. I’ll see if I can re-org the build accordingly.
On a related note, is there a way I can stop the processing of certain builds/commits on Codecov? It seems these jobs are stuck, and it would be good to have a way to cancel them.
The output shows that the report(s) are uploaded successfully:
With the provided path, there will be 1 file(s) uploaded
Total size of all the files uploaded is 634992 bytes
Finished uploading artifact cas-tests-jmx.coverage. Reported size is 634992 bytes. There were 0 items that failed to upload
Artifact cas-tests-jmx.coverage has been successfully uploaded!
Then the reports are downloaded and uploaded to Codecov:
cas-tests-jmx.coverage
cas-tests-spnego.coverage
./cas-tests-jmx.coverage:
jacocoRootReport.xml
./cas-tests-spnego.coverage:
jacocoRootReport.xml
==> Reading reports
+ reports/cas-tests-jmx.coverage/jacocoRootReport.xml bytes=9008047
+ reports/cas-tests-spnego.coverage/jacocoRootReport.xml bytes=9008488
==> Appending adjustments
docs.codecov.io/docs/fixing-reports
-> No adjustments found
==> Gzipping contents
==> Uploading reports
Uploading to
storage.googleapis.com/codecov/...
-> View reports at codecov.io/github/mmoayyed/cas/commit/13d96f456d62932f3cb2ab9f43fa95fa21fd0ce4
If I try to actually view the report at the given link, I see: “There was an error processing coverage reports.” There is no other information on that page, and it’s unclear what the error actually is.
@tom, apologies for pinging you directly. Is there something else I can do to help diagnose this? I was told that perhaps some rate limit is being exceeded. Is there something I can help set up or bypass on our end?
Hi @tom, thank you very much for getting back to me. I have been playing around with the coverage setup and decided to add support for Codacy and SonarCloud to see if I could reproduce the same error with those systems. The commits that you refer to are those that add support for those two systems, and so far I can see that coverage is reported to both Codacy and SonarCloud. Codecov remains unaffected.
The reason coverage is reported on Codecov for that particular commit, I suspect, is that I took out a number of test categories while I was testing the Codacy- and SonarCloud-related configs. The project splits tests across many modules in distinct categories; these are collected individually as JaCoCo XML files, uploaded in one job and downloaded in another, as you suggested, for the Codecov action to report them. What I noticed is that if I take out most of the test categories and leave only a few, thereby reducing the number of uploaded JaCoCo XML files, Codecov seems able to receive and process the coverage reports. That’s likely why you see coverage reported for that commit.
If it helps with diagnostics,
Codacy and SonarCloud both report coverage results without issues, which I think rules out rate limiting; they would run into the same problem, right?
This issue only started happening 1-2 weeks ago, with no apparent change on the project’s end.
@mmoayyed, it looks like the report you normally upload is half a gigabyte in size. That’s probably why it’s breaking on our end. Is it possible to call Codecov more times with smaller coverage payloads?
Hi @tom, thanks for the diagnosis. I am not sure how to achieve what you suggest. Each individual report is uploaded by the Codecov action from a reports directory; it seems to upload everything it finds there. Each JaCoCo file is about 8.6 MB, times the number of files uploaded.
Are you saying I should let the action upload each file individually, rather than picking up everything from a directory? Or did you have something else in mind?
What I think should work is to call the action for every job that runs and every test category, and pass along that run and category’s JaCoCo XML file in the action’s configuration.
What I am not sure about, and hopefully you can confirm, is whether there is a need to signal back to Codecov that the upload process has completed. Something like: “run 50 test categories in a matrix, upload 50 files one at a time, then when it’s all done, let Codecov know it’s in fact done”. Is that necessary? How is “done” determined by Codecov for it to begin processing reports?
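For concreteness, here is how I imagine that per-category upload step would look inside each matrix job (the report path and the use of `flags` are my guesses, not a confirmed setup):

```yaml
# Hypothetical sketch: each matrix job uploads only its own JaCoCo file,
# instead of one final job uploading every collected report at once.
- name: Upload ${{ matrix.category }} coverage
  uses: codecov/codecov-action@v2
  with:
    files: build/reports/jacoco/jacocoRootReport.xml   # assumed path
    flags: ${{ matrix.category }}   # tag each upload with its test category
```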