[00:30:47] @aguycalled I started getting cfund errors
[00:30:51] In my tests
[00:31:48] But I did not change anything related to cfunds, and I even tried reverting to a commit that I knew for sure worked.
[00:31:53] Still getting the same issue
[00:35:09] which cfund tests?
[00:35:18] I think all of them?
[00:35:25] Wait, let me do a sanity check
[00:35:32] I'm building the master branch
[00:35:41] And will run the suite on that
[00:38:39] Looks like your build is getting the same errors
[00:38:57] assert (get_bip9_status(node, "communityfund")["status"] == "started")
[00:43:10] Yeah, looks like I'm not crazy after all, the error persists even when I build the master branch
[00:44:03] mmm
[00:44:06] Could be a bug in the cfund code?
[00:44:39] Something timestamp related maybe, that would be the only explanation for it starting to fail just today
[00:44:45] can you tell me one concrete test you see failing?
[00:44:59] ./qa/pull-tester/rpc-tests.py cfund-proposalvotelist
[00:45:00] or is it all?
[00:45:34] It's anything that is calling ./qa/rpc-tests/test_framework/cfund_util.py
[00:45:43] Basically the cfund fails
[00:46:14] From the looks of it, seems like all tests that use activate_cfund are failing
[00:46:36] And I'm testing with the latest master branch which I know was working just a few hours ago
[00:46:50] this is weird, as it was working
[00:46:51] yeah
[00:46:59] Looks like your build on travis-ci also failed with the same issue
[00:47:37] Yeah, must be something timestamp related (Something that would cause the cfund to get rejected?)
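For context on the assertion above: in BIP9-style deployments, a softfork's status depends on where the chain's time falls relative to the deployment's nStartTime/nTimeout window, which is why a test suite can start failing "out of nowhere" once the wall clock crosses nTimeout. The sketch below is illustrative only: the real BIP9 state machine also has locked_in/active states and evaluates transitions at retarget-period boundaries, and `bip9_status` here is a made-up helper, not the actual consensus code.

```python
# Simplified, illustrative BIP9-style status check (NOT the real
# consensus logic): a deployment can only be "started" while the
# chain time is inside the [nStartTime, nTimeout) window.
def bip9_status(chain_time, start_time, timeout):
    if chain_time < start_time:
        return "defined"          # window not yet open
    if chain_time >= timeout:
        return "failed"           # window closed without lock-in
    return "started"

MAY_1_2017 = 1493424000  # nStartTime from the quoted chainparams
MAY_1_2019 = 1556668800  # nTimeout from the quoted chainparams

# While real time was inside the window, the tests saw "started"...
print(bip9_status(1540000000, MAY_1_2017, MAY_1_2019))  # started

# ...but once real time crossed nTimeout, the same unchanged code
# reports "failed", so assert status == "started" begins to fail.
print(bip9_status(1556700000, MAY_1_2017, MAY_1_2019))  # failed
```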
[00:47:50] yes
[00:47:52] ok
[00:48:19] // Deployment of Community Fund
consensus.vDeployments[Consensus::DEPLOYMENT_COMMUNITYFUND].bit = 6;
consensus.vDeployments[Consensus::DEPLOYMENT_COMMUNITYFUND].nStartTime = 1493424000; // May 1st, 2017
consensus.vDeployments[Consensus::DEPLOYMENT_COMMUNITYFUND].nTimeout = 1556668800; // May 1st, 2019
[00:48:22] bingo
[00:48:28] nTimeout 😃
[00:49:08] ill move it to 1651363200
[00:49:11] 1/5/2022
[00:49:20] 😄
[00:49:21] Hahaha
[00:49:23] chainparams.cpp
[00:49:32] Will you merge that to master now?
[00:49:51] So I can rebase based on master? Or should I make the change to my branch as well?
[00:50:18] yes ill do
[00:50:23] Thanks
[00:50:26] Wew
[00:50:37] For a few minutes I thought I was going crazy
[00:50:37] LOL
[00:51:48] done
[00:53:18] BTW, I finally figured out where the LOCK(cs_main) needed to be added
[00:53:24] I'm doing final testing now
[00:53:35] Then PR 450 should be ready for final review
[00:53:36] 😄
[00:53:47] great work
[00:53:56] 😉
[00:54:06] i'm happy you joined dev efforts!
[01:07:00] @aguycalled
[01:07:11] I think you might need to run certbot renew
[01:07:13] On https://build.nav.community/depends-sources/
[01:07:31] Certificate just expired
[01:08:03] Or if you have certbot renew added to a crontab, you'll also need to nginx reload or apache2 reload
[01:14:11] done thx
[14:12:39] @mxaddict just for reference, the rpc test suite has a method to set a node time
[14:12:53] set_node_times
[14:13:12] i'm not sure how it will work with the ntp sync tho
[19:51:33] @mxaddict could you also leave a navcoin address for the bounty payments in the issues you are assigned
[20:30:46] @aguycalled I commented on the issues with a fresh address 😄
[20:31:31] thx, ill send the coins tomorrow 😉
[20:31:55] Thanks for the tip about set_node_times
[20:32:26] I might be able to use that to test the getstakereport better
[20:48:24] @aguycalled Is anyone working on the show mnemonic and import mnemonic UI?
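For reference, the old and new nTimeout values discussed above can be checked against their stated calendar dates with a couple of lines of Python (a quick sanity-check sketch, not part of the codebase):

```python
from datetime import datetime, timezone

def utc_date(ts):
    """Render a Unix timestamp as a UTC calendar date string."""
    return datetime.fromtimestamp(ts, tz=timezone.utc).strftime("%Y-%m-%d")

print(utc_date(1556668800))  # 2019-05-01  (the expired nTimeout)
print(utc_date(1651363200))  # 2022-05-01  (the replacement value)
```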
[20:48:31] If not, I might give it a shot.
[20:50:37] nope, should i assign it to you?
[20:56:07] Not yet, I'll have a look-see first
[20:56:18] To see if I can actually get it done
[20:56:32] @aguycalled
[21:43:06] I've approved and merged #459 and #450
[21:43:18] looking at #456 now
[21:43:32] then will pull master into the rc PR
[21:44:58] ah #456 has conflicts with master now
[21:46:17] i can resolve it, it's just the maturity param added to the devnet chainparams
[21:53:22] Just wondering what our procedure for an approved PR should be. E.g. often we (myself included) approve a PR with no comment. I think we should get into the habit of documenting what we did to approve it. E.g. did we eyeball the diff, did we code review the changes in context, did we compile successfully, did we run the relevant unit tests, did we manually test the patched scenario, did we smoke test other features to make sure normal
[21:53:23] wallet functions remain working? etc..
[21:54:58] thoughts?
[22:00:13] @prole Not sure, I don't have much experience with open source projects. But at my desk job we have a small checklist:
1. Eyeball the merge
2. Review in depth if the changes are more than 5 lines
3. Make sure all test cases pass (Sometimes we have manual tests)
4. Approve/Reject based on results from 1 - 3
[22:00:59] The manual tests are usually just for stuff that can't be automated, like GUI or UX
[22:43:06] @prole looks like the latest build for PR #456 has an issue with cfund-paymentrequest-state-reorg.py
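On the set_node_times helper mentioned earlier: in the upstream Bitcoin-derived rpc test framework this kind of helper simply invokes the setmocktime RPC on every node so they all agree on a fake wall clock. The sketch below assumes NavCoin's test framework matches upstream in this respect; the FakeNode class is a stand-in for demonstration only (the real objects are RPC proxies).

```python
# Sketch of a set_node_times-style helper, as found in the upstream
# Bitcoin rpc test framework's util module (assumed, not verified
# against navcoin-core): set the same mock time on every node.
def set_node_times(nodes, t):
    for node in nodes:
        node.setmocktime(t)

# Minimal stand-in for an RPC proxy, for demonstration purposes.
class FakeNode:
    def __init__(self):
        self.mocktime = None
    def setmocktime(self, t):
        self.mocktime = t

nodes = [FakeNode(), FakeNode()]
set_node_times(nodes, 1556668800)
print([n.mocktime for n in nodes])  # [1556668800, 1556668800]
```

This is also why the ntp-sync concern raised in the chat matters: mock time only affects the node's internal clock, so anything that re-syncs against real time could fight with it.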