indexer – Cron ‘catalogrule_apply_all’ locks cron ‘indexer_reindex_all_invalid’ and catalogsearch_fulltext keeps processing, Magento 2.3.5

Took me quite some time to debug this. Every night (by default at 0:00) catalogrule_apply_all is run by cron and invalidates all indexes for all stores.
As a consequence catalogsearch_fulltext is launched (it takes about 8 minutes in my store).

In the meantime, new crons run, and the log fills up with:

main.WARNING: Could not acquire lock for cron job: indexer_reindex_all_invalid () ()

At first there are five of these entries, but after a day it’s about 60 of them, every minute. And catalogsearch_fulltext is then processing constantly: as soon as it finishes, another reindex starts again, probably because the ‘all invalid’ reindex no longer works. indexer_reindex_all_invalid never comes back into action.

The system does not unlock indexer_reindex_all_invalid once this error occurs.

If I grep the process list, I see that the cron:run job seems to hang, and for about 20 minutes at least two cron processes are running:

(USER)+ 11321  0.0  0.0   4340   724 ?        Ss   10:09   0:00 /bin/sh -c /usr/bin/php /(PATH)/public_html/bin/magento cron:run 2>&1 | grep -v "Ran jobs by schedule" >> /(PATH)/public_html/var/log/magento.cron.log

(USER)+ 11325 60.2  4.8 590908 197924 ?       R    10:09  10:14 /usr/bin/php /(PATH)/public_html/bin/magento cron:run

(USER)+ 12542  0.0  0.0   4340   772 ?        Ss   10:26   0:00 /bin/sh -c /usr/bin/php /(PATH)/public_html/bin/magento setup:cron:run >> /(PATH)/public_html/var/log/setup.cron.log

(USER)+ 12544 57.0  2.1 484272 88712 ?        R    10:26   0:01 /usr/bin/php /(PATH)/public_html/bin/magento setup:cron:run

Normally, all crons work perfectly – not a single cron is missed in cron_schedule.
I can reset the indexing by truncating all _cl tables, resetting the mview_state version_id, and truncating cron_schedule, followed by a manual reindex and a cache flush.
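
That manual reset can be scripted. A hedged sketch that only builds the SQL statements (the changelog table names below are examples, not a complete list, and the status reset on mview_state is my own assumption – verify against your schema and test on a backup first):

```python
# Hypothetical sketch: generate the reset statements described above.
# The changelog table names are examples only -- list your real *_cl tables
# first (e.g. SHOW TABLES LIKE '%\_cl') and always test against a backup.
cl_tables = ["catalog_category_product_cl", "catalogsearch_fulltext_cl"]

statements = [f"TRUNCATE TABLE {t};" for t in cl_tables]
statements.append("UPDATE mview_state SET version_id = 0, status = 'idle';")  # status reset is my assumption
statements.append("TRUNCATE TABLE cron_schedule;")

for stmt in statements:
    print(stmt)
```

Afterwards, run `bin/magento indexer:reindex` and `bin/magento cache:flush` as in the manual procedure.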

Everything works perfectly (including indexer_reindex_all_invalid) until the next time catalogrule_apply_all is launched. I verified this by adding it manually to cron_schedule.

But now I don’t know what to do anymore; hopefully you have an answer.

The error is thrown by /vendor/magento/module-cron/Observer/ProcessCronQueueObserver.php.
I don’t know whether this needs updating or adjustment.

Additional info

  • I do not have any catalog rules. I tried making one and deleting it again; no difference.
  • I’m using the Elasticsearch plugin by Mirasvit, Magento 2.3.5 on a Cloudways server, and Redis for sessions. (I don’t think it is Mirasvit-related.)
  • In env.php I have ‘lock’ set to ‘db’ – I think these are just the default settings.
  • I’ve set ‘separate process’ to ‘no’ for all cron jobs in config → system.
  • The site was migrated from 1.9.x to 2.3.5.

It is just catalogrule_apply_all that is messing things up. Does anyone have a suggestion?

post processing – Difference between DxO’s Nik Collection 3 and DxO PhotoLab 3?

Can someone tell me the difference between DxO’s Nik Collection 3 and DxO PhotoLab 3? They seem so similar but I can’t find any comparison to help me decide.

(If this is somehow the wrong sort of question to post here, please accept my apologies and help me find the appropriate forum. Thanks)

Can I sue for mortgage processing delay?

I applied for a mortgage refinance and was promised a decision in six weeks. After four frustrating months of no action and no decision, I withdrew the application and, because rates had changed in the meantime, had to settle for a higher rate with another lender. Can I sue for the difference in rates – basically the extra amount I will pay over the next 30 years, discounted to today?

stream processing – How to keep a data warehouse updated?

Suppose there is a system (like an ERP) that writes to a database (not too big, less than 100 GB). You need to export the data from this database to a data warehouse (like Redshift or BigQuery) as many times a day as you can. What would be a good solution for that?
The system has a feature that exports only the delta, so this is what I was thinking:

1 – Write an ETL script that queries the delta, formats it as Avro, and saves it in a bucket (GCS or S3)
2 – Trigger a function when the object is inserted; it reads the object and inserts it into a staging table (one for each table in the origin DB)
3 – Trigger a function to merge the staging table into the main table
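
Step 3 is typically a single MERGE (upsert) statement in BigQuery (Redshift has traditionally used a DELETE-then-INSERT pattern instead). A sketch that just builds such a statement – the table, key, and column names here are examples, not your schema:

```python
# Hedged sketch of step 3: build a BigQuery-style MERGE (upsert) from the
# staging table into the main table. Table and column names are examples.
def build_merge(main, staging, key, cols):
    updates = ", ".join(f"T.{c} = S.{c}" for c in cols)
    col_list = ", ".join([key] + cols)
    src_list = ", ".join(f"S.{c}" for c in [key] + cols)
    return (
        f"MERGE {main} T USING {staging} S ON T.{key} = S.{key} "
        f"WHEN MATCHED THEN UPDATE SET {updates} "
        f"WHEN NOT MATCHED THEN INSERT ({col_list}) VALUES ({src_list})"
    )

sql = build_merge("dw.orders", "dw.orders_staging", "order_id",
                  ["status", "updated_at"])
print(sql)
```

One MERGE per batch also sidesteps the row-by-row update limits mentioned below, since the whole staging table is applied in a single set-based statement.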

I’m not too happy with this approach, because it feels so limited. I think I’m missing something here. Should data in a DW be so hard to maintain? I see a lot of examples of how to insert data into a DW, but very few on how to keep it updated.

Also, suppose this delta mechanism didn’t exist and we had to use a streaming solution (like Kinesis). That would make things even harder, because data would land in the bucket much faster, generating lots of small files. How could I handle a scenario like that, given that DWs are slow to update row by row (BigQuery even limits the number of updates per day)?

magento2 – There has been an error processing your request

It seems your hosting provider has disabled PHP’s exec() function. You need to ask your hosting provider to enable it.

Basically the error comes from the Shell.php file in /lib/internal/Magento/Framework. See the code below:

$disabled = explode(',', ini_get('disable_functions'));
if (in_array('exec', $disabled)) {
    throw new Exception("exec function is disabled.");
}
exec($command, $output, $exitCode);
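
If you want to confirm what a given disable_functions value implies, the same check is easy to mirror outside PHP. A small illustrative sketch (note it trims whitespace around each name, which a raw explode() does not):

```python
def is_disabled(func_name, disable_functions):
    """Mirror the check above: split the ini value on commas, look for the name."""
    disabled = [f.strip() for f in disable_functions.split(",") if f.strip()]
    return func_name in disabled

print(is_disabled("exec", "exec,passthru,shell_exec"))  # True
print(is_disabled("exec", ""))                          # False
```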


Hope it helps.


natural language processing – More Information about these Question Answering Systems

My question is about the multi-hop question answering systems described on the page and reproduced in the following picture:

enter image description here

I am interested in the models BFR-Graph, GSAN-large, ETC-large and AMGN.
I think the capital letters above are the initials of the models’ names in the original papers, but Google does not retrieve the results I’m looking for.

Thanks in advance for any information about the four models cited above.

package management – Errors were encountered while processing: /var/cache/apt/archives/base-files_10.1ubuntu2.8_amd64.deb

Hope you all are doing well.

I was trying to install Chromium browser on Ubuntu, when I came across this issue:

root@desktop:~$ sudo apt install chromium-browser
Reading package lists... Done
Building dependency tree       
Reading state information... Done
The following additional packages will be installed:
  base-files chromium-browser-l10n chromium-codecs-ffmpeg-extra
Suggested packages:
  webaccounts-chromium-extension unity-chromium-extension adobe-flashplugin
The following NEW packages will be installed:
  chromium-browser chromium-browser-l10n chromium-codecs-ffmpeg-extra
The following packages will be upgraded:
1 upgraded, 3 newly installed, 0 to remove and 308 not upgraded.
1 not fully installed or removed.
Need to get 0 B/71.3 MB of archives.
After this operation, 243 MB of additional disk space will be used.
Do you want to continue? (Y/n) y
Reading changelogs... Done
dpkg: warning: files list file for package 'libc-bin' missing; assuming package has no files currently installed
(Reading database ... 200414 files and directories currently installed.)
Preparing to unpack .../base-files_10.1ubuntu2.8_amd64.deb ...
dpkg (subprocess): unable to execute old base-files package pre-removal script (/var/lib/dpkg/info/base-files.prerm): Permission denied
dpkg: warning: old base-files package pre-removal script subprocess returned error exit status 2
dpkg: trying script from the new package instead ...
dpkg (subprocess): unable to execute new base-files package pre-removal script (/var/lib/dpkg/ Permission denied
dpkg: error processing archive /var/cache/apt/archives/base-files_10.1ubuntu2.8_amd64.deb (--unpack):
 new base-files package pre-removal script subprocess returned error exit status 2
dpkg (subprocess): unable to execute installed base-files package post-installation script (/var/lib/dpkg/info/base-files.postinst): Permission denied
dpkg: error while cleaning up:
 installed base-files package post-installation script subprocess returned error exit status 2
Errors were encountered while processing:
E: Sub-process /usr/bin/dpkg returned an error code (1)

What should I try? A force install returned the same error. And is it okay to remove and reinstall base-files? (I was thinking of doing that, but there’s a warning that it’s an essential package and should not be removed.)

Suggestions? If possible I want to avoid reinstalling Ubuntu, as there are about 12 user accounts with saved files.
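
In my experience, “Permission denied” on dpkg maintainer scripts usually means the scripts have lost their execute bit or /var is mounted noexec – worth checking `ls -l /var/lib/dpkg/info/base-files.prerm` and `findmnt -no OPTIONS /var` before reinstalling anything. The failure mode itself is simple to reproduce (this is a sketch, not dpkg’s actual code):

```python
# Minimal reproduction of the failure mode (not dpkg's actual code): a
# maintainer-style script without its execute bit fails with EACCES,
# which surfaces as "Permission denied" exactly like the output above.
import os
import subprocess
import tempfile

fd, path = tempfile.mkstemp(suffix=".sh")
with os.fdopen(fd, "w") as f:
    f.write("#!/bin/sh\necho ok\n")

os.chmod(path, 0o644)  # readable but not executable, like the failing prerm
denied = False
try:
    subprocess.run([path], check=True)
except PermissionError:
    denied = True
    print("Permission denied")

os.chmod(path, 0o755)  # restore the execute bit and it runs normally
result = subprocess.run([path], capture_output=True, text=True)
print(result.stdout.strip())
os.unlink(path)
```

If the execute bit is the culprit, restoring it on the affected scripts and re-running `sudo apt -f install` is a common fix – far less drastic than removing base-files.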

image processing – 3D ImageCooccurrence

I’d like to compute ImageCooccurrence for an Image3D object. Mathematica advertises that “Now essentially any operation possible for 2D images also works for 3D images.”

I’ve defined a 3D kernel and a 3D image as follows:

kernel3d = {
  {{0, 0, 0}, {0, 1, 0}, {0, 0, 0}},
  {{0, 1, 0}, {1, 0, 1}, {0, 1, 0}},
  {{0, 0, 0}, {0, 1, 0}, {0, 0, 0}}};
image3d = Image3D[RandomReal[1, {5, 5, 5}]];

When I run ImageCooccurrence[image3d, 10, kernel3d], Mathematica just reports that the input should be an Image.

Is there an ImageCooccurrence extension for Image3D? If not, does anyone have suggestions for accomplishing this quickly?
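
The underlying computation is language-independent, so as a fallback it can be done directly on the voxel array: pair each voxel with the voxel a fixed offset away and count the quantized-value pairs. A hedged sketch in Python/NumPy (illustrative only – this is not Mathematica’s implementation, and the function name and quantization scheme are mine):

```python
# Illustrative only -- not Mathematica's implementation, just the underlying
# computation: pair each voxel with the voxel `offset` away and count the
# quantized-value pairs.
import numpy as np

def cooccurrence_3d(volume, levels, offset):
    """Co-occurrence counts for a 3D volume of values in [0, 1)."""
    q = np.minimum((np.asarray(volume) * levels).astype(int), levels - 1)
    dz, dy, dx = offset
    Z, Y, X = q.shape
    # slice so that src[i] and dst[i] sit `offset` apart in the volume
    src = q[max(-dz, 0):Z - max(dz, 0),
            max(-dy, 0):Y - max(dy, 0),
            max(-dx, 0):X - max(dx, 0)]
    dst = q[max(dz, 0):Z + min(dz, 0),
            max(dy, 0):Y + min(dy, 0),
            max(dx, 0):X + min(dx, 0)]
    mat = np.zeros((levels, levels), dtype=int)
    np.add.at(mat, (src.ravel(), dst.ravel()), 1)
    return mat

vol = np.random.default_rng(0).random((5, 5, 5))
mat = cooccurrence_3d(vol, 4, (0, 0, 1))
print(mat.sum())  # 5 * 5 * 4 = 100 neighbour pairs
```

To mimic a neighbourhood kernel like the 6-connected one above, sum the matrices over the corresponding offsets (or symmetrize one direction with `mat + mat.T`); `levels` plays the role of ImageCooccurrence’s second argument.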

post processing – Why does darktable’s shadows and highlights module invert this image?

I’ve recorded a screencast of the issue here.

I have a test image that I made so I could see how various modules in darktable affected the histogram:

enter image description here

I import this image into darktable and switch on shadows and highlights, which gives me a result like the one below:

enter image description here

(not much has changed, but the bottom stripes are slightly darker)

If I then increase the exposure, everything works as I would expect (the image gets lighter) until I move past about +1 EV, at which point the background of the image starts to darken. By +2 EV, the image looks like this:

enter image description here

If I switch off the shadows and highlights module, the image returns to what I would expect.

My question is:

  • Why does increasing the exposure of this image make it darker?

…and I wouldn’t mind some context around

  • Is it possible to switch on ‘shadows and highlights’ in a neutral state so that it doesn’t change the image (like switching on crop or exposure)?
  • Is there a repository anywhere of more professionally generated ‘see what the histogram looks like’ test images?

post processing – How do I shift a series of time-lapse photos (a week between each photo) to match hard points?

I have collected a series of photos approximately 1 week apart using a drone. Thus, the photos are slightly different. I want to transform each photo using a series of hard points so that those points align. I don’t want to merge the photos, just align all of the photos so that they don’t “jump” around when I toggle through them. I assume I choose a base photo that I’ll match the others to and then crop them all to have the same frame size.