Codecov stuck in "Processing..." for commits of a public GitHub repository

Description

If you open the commits page of our Codecov project, you’ll see that

  1. Some commits (4-6 days old) are stuck in the “Processing…” state.
  2. The list of commits is not up-to-date, as it misses the latest GitHub commits.

Clicking on a “Processing…” button (for example, this link) shows the message “Unable to find report content in the storage archive.”, even though those commits exist on GitHub.

Repository

CI/CD

GitHub action

Uploader

See here

Commit SHAs

  • a147c2fced8bf4b1b7e38fb5eb00e1827cff667e
  • d08afb6fc16f9c481de35ea99b2c458e914e00d5
  • e13cac0b6761a46e544e963784afc857d471b98a

Codecov YAML

None.

Codecov Output

See the “Upload coverage to Coveralls” step from here.

Steps to Reproduce

See the description.

Additional Information

None.

Hi @fpoli,

  1. Yes, we had an outage documented here. I believe this was a side effect during that time period.
  2. It looks like you have some commits in that time frame that do not run the coverage workflow. That’s probably why we didn’t get coverage reports.

I would recommend installing the Codecov GitHub app to help prevent future problems.

Hi @tom,

We’re seeing a lot of commits stuck in “Processing…” in this repo too; you can see them on the commits page. This started at approximately the time of the outage and has continued for several days with no sign of stopping. Is it for the same reason?

Hi @Moandor, this shouldn’t be related to that incident. I’m not quite sure why they are stuck in Processing, so I’ll bump up to the product team to get some resolution here. Thanks for bringing this up.

Hi @Moandor, it looks like this is an issue on our side. It’s going to take some time to investigate and get a fix in. The main issue is that the repository is hitting a limit on the number of uploads, so in the meantime, if possible, I would suggest combining several coverage reports into a single upload step.
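A minimal sketch of that idea (step names and paths here are placeholders, assuming JaCoCo XML reports like the ones discussed later in this thread): gather the generated reports into one directory, then call the Codecov action once so they go up as a single upload.

```yaml
# Hypothetical sketch: collect all generated reports into one directory,
# then invoke the Codecov action a single time for all of them.
- name: Collect coverage reports
  run: |
    mkdir -p combined-reports
    find build/reports -name 'jacoco*.xml' -exec cp {} combined-reports/ \;
- name: Upload coverage to Codecov
  uses: codecov/codecov-action@v1
  with:
    directory: combined-reports
```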

I see the same sort of issue for this repository. Is this possibly related to an outage or is there another underlying cause?

@mmoayyed, this is a different underlying cause. If this is a blocker, I would suggest uploading reports in a single Codecov step if possible, or decreasing the number of builds. I apologize for the inconvenience while we investigate a solution.

Thanks very much for the update. Please post back when you do find a solution.

I don’t quite follow your note on “uploading reports in the same Codecov step”. We have GitHub Actions jobs that each test a particular area of the system, and each job posts its own coverage results. How can I combine the step that runs the tests with the step that uploads the coverage results, shown below?

      - name: Run Tests
        run: ./runtest.sh ${{ matrix.category }}
      - name: Upload coverage to Codecov
        uses: codecov/codecov-action@v1.0.13
        with:
          flags: ${{ matrix.category }}
          fail_ci_if_error: true

@mmoayyed, I have seen other users upload the test results using actions/upload-artifact and then add a job at the end of the CI flow, depending on all previous test runs, that uploads coverage to Codecov. You can see this here underneath continuous-integration.yml.
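That pattern could be sketched roughly like this (category names and paths are placeholders, reusing the step names from snippets elsewhere in this thread):

```yaml
# Hypothetical sketch: each matrix test job saves its report as an
# artifact; a final job, gated on all test jobs, uploads them once.
jobs:
  tests:
    strategy:
      matrix:
        category: [api, web]   # placeholder categories
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v2
      - name: Run Tests
        run: ./runtest.sh ${{ matrix.category }}
      - name: Save coverage report
        uses: actions/upload-artifact@v2
        with:
          name: coverage-${{ matrix.category }}
          path: ./build/reports/jacoco/jacocoRootReport/jacocoRootReport.xml
  coverage:
    needs: tests             # runs only after all matrix jobs finish
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v2
      - name: Download coverage reports
        uses: actions/download-artifact@v2
        with:
          path: reports
      - name: Upload to Codecov
        uses: codecov/codecov-action@v1
        with:
          directory: reports
```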

Thank you for the reference and example. I’ll see if I can re-org the build accordingly.

On a related note, is there a way I can stop the processing of certain builds/commits on Codecov? It seems like these jobs are stuck, and it would be good to have a way to cancel them.

Per the example, I modified my test job to upload coverage files when it’s done:

      - name: "Upload coverage file"
        uses: "actions/upload-artifact@v2"
        with:
          name: "cas-tests-${{ matrix.category }}.coverage"
          path: "./build/reports/jacoco/jacocoRootReport/jacocoRootReport.xml"

The output shows that the report(s) are uploaded successfully:

With the provided path, there will be 1 file(s) uploaded
Total size of all the files uploaded is 634992 bytes
Finished uploading artifact cas-tests-jmx.coverage. Reported size is 634992 bytes. There were 0 items that failed to upload
Artifact cas-tests-jmx.coverage has been successfully uploaded!

Then the reports are downloaded and uploaded to codecov:

      - name: "Download coverage files"
        uses: "actions/download-artifact@v2"
        with:
          path: "reports"
      - name: "Display structure of downloaded files"
        run: ls -R
        working-directory: reports

      - name: "Upload to Codecov"
        uses: "codecov/codecov-action@v1"
        with:
          directory: reports

Logs show that files were uploaded successfully:

cas-tests-jmx.coverage
cas-tests-spnego.coverage

./cas-tests-jmx.coverage:
jacocoRootReport.xml

./cas-tests-spnego.coverage:
jacocoRootReport.xml


==> Reading reports
    + reports/cas-tests-jmx.coverage/jacocoRootReport.xml bytes=9008047
    + reports/cas-tests-spnego.coverage/jacocoRootReport.xml bytes=9008488
==> Appending adjustments
    docs.codecov.io/docs/fixing-reports
    -> No adjustments found
==> Gzipping contents
==> Uploading reports

 Uploading to
storage.googleapis.com/codecov/...


    -> View reports at codecov.io/github/mmoayyed/cas/commit/13d96f456d62932f3cb2ab9f43fa95fa21fd0ce4

If I try to actually view the report at the given link, I see: “There was an error processing coverage reports.” There is no other information on that page, and it’s unclear what the error actually is.

What’s the best way to proceed from here?

Link to the CI:

Link to the report:

I think you might have to do a checkout step in the GitHub job in order for us to pull the network (the repository’s file list).
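Concretely, that could look something like this (a sketch only, placing a checkout step ahead of the download step shown earlier in the thread):

```yaml
# Hypothetical placement: check out the repository first so Codecov can
# map the reports onto the project's file tree, then fetch the artifacts.
- uses: actions/checkout@v2
- name: "Download coverage files"
  uses: "actions/download-artifact@v2"
  with:
    path: "reports"
```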

Thank you, Tom.

I added a checkout step right before where coverage files are downloaded. I am seeing the same error unfortunately:

It’s unclear exactly what the root cause is. All that is reported back is “There was an error processing coverage reports.”

This is the workflow that ran:

This is the job that uploads files to codecov:

Is this an issue with the directory structure and the fact that reports are uploaded from a reports subdirectory? How could I diagnose this?

Thanks much for your help.

@tom, apologies for pinging you directly. Is there something else I can do to help diagnose this? I was told that perhaps some rate limit is being exceeded. Is there something I can help set up or bypass on our end?

@mmoayyed, apologies for the delay here. I noticed that this commit has coverage now. I wonder if this line has anything to do with it.

Hi @tom, thank you very much for getting back to me. I have been experimenting with the coverage setup and decided to add support for Codacy and SonarCloud to see if I could reproduce the same error with those systems. The commit you refer to is one of those that add support for those two systems, and so far coverage is reported to both Codacy and SonarCloud without issue, while the Codecov issue remains.

The reason coverage is reported on Codecov for that particular commit, I suspect, is that I removed a number of test categories while testing the Codacy- and SonarCloud-related configs. The project splits tests across many modules into distinct categories; coverage for each is collected as an individual JaCoCo XML file, uploaded in one job and downloaded in another, as you suggested, for the Codecov action to report. What I noticed is that when I remove most of the test categories and leave only a few, thereby reducing the number of uploaded JaCoCo XML files, Codecov seems able to receive and process the coverage reports. That’s likely why you see coverage reported for that commit.

If it helps with diagnostics,

  • Codacy and SonarCloud both report coverage results without issues, which I think rules out “rate-limiting” issues. They would have the same problem, right?

  • This issue only started to happen 1-2 weeks ago, with no apparent change on the project’s end.

@mmoayyed, it looks like the report you normally upload is half a gigabyte in size. That’s probably why it’s breaking on our end. Is it possible to call Codecov more times with smaller coverage payloads?

Hi @tom, thanks for the diagnosis. I am not sure how to achieve what you suggest. Each individual report is uploaded by the Codecov action from a reports directory; it seems to upload everything it finds there. Each JaCoCo file is about 8.6 MB, multiplied by the number of files uploaded.

Are you saying I should let the action upload each file individually, rather than picking everything from a directory? or did you have something else in mind?

@mmoayyed, that is what I had in mind. This would, of course, mean calling the action a few times. Is that possible for you?

@tom, I’ll give it a try, sure.

What I think should work is to call the action for every job and every test category, passing along the JaCoCo XML file for that run and category in the action’s configuration.

What I am not sure about, and hopefully you can confirm, is whether there is a need to signal back to codecov that the upload process has completed. So something like, “run 50 test categories in a matrix, upload 50 files one at a time. Then when it’s all done, let codecov know it’s in fact done”. Is that necessary? How’s “done” determined by codecov for it to begin processing reports?