asynchronous processing – How to import definitions from front end into LocalSubmit

I am trying to use LocalSubmit to execute an intensive task in the background. How do I distribute definitions to the kernel that LocalSubmit runs the task in?

(*example*)
(*some function*)
r`func := 100;

(*some task using that function that saves the result in r`r*)
r`task := (r`r = r`func)

r`r = -1;
LocalSubmit[r`task,
 HandlerFunctions -> <|"TaskFinished" -> MessageDialog,
   "FailureOccurred" -> Print|>]

(*r`r printed after the message popped up*)
(*-1*) (*----------> should be 100*)

entities – OO way of adding base field definitions

Without using any hook, it’s not possible, since changing the class used for an entity requires implementing hook_entity_type_alter(), which is problematic when more than one module needs to alter the entity definition.

Imagine two different modules that try to change the class used to implement the same entity.
For example, the first module could use the following code.

/**
 * Implements hook_entity_type_alter().
 */
function login_id_entity_type_alter(array &$entity_types) {
  $entity_types['user']->setClass('Drupal\login_id\Entity\LoginIdUser');
}

namespace Drupal\login_id\Entity;

use Drupal\Core\Entity\EntityTypeInterface;
use Drupal\Core\Field\BaseFieldDefinition;
use Drupal\user\Entity\User;

class LoginIdUser extends User {

  /**
   * {@inheritdoc}
   */
  public static function baseFieldDefinitions(EntityTypeInterface $entity_type) {
    /** @var \Drupal\Core\Field\BaseFieldDefinition[] $fields */
    $fields = parent::baseFieldDefinitions($entity_type);

    $fields['login_id'] = BaseFieldDefinition::create('string')
      ->setLabel(t('Login ID'))
      ->setDescription(t('The ID used for the login credentials.'))
      ->addConstraint('LoginId')
      ->addConstraint('LoginIdUnique');

    return $fields;
  }

}

The second module could use the following code.

/**
 * Implements hook_entity_type_alter().
 */
function admin_email_entity_type_alter(array &$entity_types) {
  $entity_types['user']->setClass('Drupal\admin_email\Entity\AdminEmailUser');
}

namespace Drupal\admin_email\Entity;

use Drupal\Core\Entity\EntityTypeInterface;
use Drupal\Core\Field\BaseFieldDefinition;
use Drupal\user\Entity\User;

class AdminEmailUser extends User {

  /**
   * {@inheritdoc}
   */
  public static function baseFieldDefinitions(EntityTypeInterface $entity_type) {
    /** @var \Drupal\Core\Field\BaseFieldDefinition[] $fields */
    $fields = parent::baseFieldDefinitions($entity_type);

    $fields['admin_mail'] = BaseFieldDefinition::create('email')
      ->setLabel(t('Admin email'))
      ->setDescription(t('The email used by administrator users to contact the user.'));

    return $fields;
  }

}

With two modules implementing that code, only one field would be added to the entity. To get both fields, LoginIdUser would have to extend AdminEmailUser, which would mean one module must have the other as a dependency. (This assumes that admin_email_entity_type_alter() is executed before login_id_entity_type_alter(), which is what normally happens when the hook execution order isn’t altered by a hook_module_implements_alter() implementation, or when the weight associated with a module isn’t altered by a call to module_set_weight().)

Implementing hook_entity_base_field_info() instead allows modules to add different fields without interfering with each other.
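As a sketch, this is what the first module’s field could look like moved into that hook (module and field names follow the example above; the constraint setup is omitted):

```php
use Drupal\Core\Entity\EntityTypeInterface;
use Drupal\Core\Field\BaseFieldDefinition;

/**
 * Implements hook_entity_base_field_info().
 */
function login_id_entity_base_field_info(EntityTypeInterface $entity_type) {
  $fields = [];
  if ($entity_type->id() === 'user') {
    // Each module contributes its own base field; implementations do not
    // override each other the way hook_entity_type_alter() class swaps do.
    $fields['login_id'] = BaseFieldDefinition::create('string')
      ->setLabel(t('Login ID'))
      ->setDescription(t('The ID used for the login credentials.'));
  }
  return $fields;
}
```

The second module would implement the same hook for its own field, and both fields would be added.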

As a side note, hooks can be placed in a file different from the .module file, which is automatically loaded by Drupal. Which files Drupal loads when looking for hooks is influenced by hook_hook_info(). With the implementations of that hook provided by Drupal core modules, the files Drupal core looks for are the following.

  • <module_name>.tokens.inc
  • <module_name>.views.inc

If you are interested in reducing the size of the .module file, implementing hook_hook_info() to tell Drupal where your module’s hooks are could be a way to achieve it.
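For example, a module that wants implementations of its hook_my_hook() to live in a separate include file could declare this (hook name and group name are illustrative):

```php
/**
 * Implements hook_hook_info().
 */
function mymodule_hook_info() {
  return [
    // Implementations of hook_my_hook() are looked for in
    // <implementing_module>.custom.inc instead of the .module file.
    'my_hook' => ['group' => 'custom'],
  ];
}
```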

audit – Where to download OVAL definitions and baselines for OpenSCAP?


clear – All definitions are cleared when I reopen my file

I just started getting into Mathematica and have the problem that whenever I save and reopen my .nb file, all functions and variables are cleared and I need to Shift+Enter every cell again to continue working on it. Is this normal, is it intended to be like that, or is it possible to turn it off?

compilers – L-attributed definitions include all syntax-directed definitions based on LL(1) grammars

I was going through the text Compilers: Principles, Techniques and Tools by Ullman et al., where I came across the following claim:

L-attributed definitions include all syntax-directed definitions based on LL(1) grammars.

What is the intuition or logic behind this statement?

In the later sections they show that L-attributed definitions can be easily implemented by a depth-first traversal of the parse tree (and hence easily by a predictive parser, recursive or non-recursive).$^\dagger$

Is it the case that the authors make the claim because LL(1) grammars can also be parsed by a predictive parser, combined with the logic of $\dagger$?

entities – How best to determine what differs between field schema and field definitions when “Mismatched entity and/or field definitions” is reported?

I’ve inherited a large, enterprise Drupal 8 site in which we have many “Mismatched entity and/or field definitions” warnings reported in the status report. The site is currently running on Drupal 8.9.13, and we’d like to get these issues addressed before upgrading to Drupal 9 (not least of which because they seem to cause problems after the upgrade when I attempt to run database updates).

I know these issues were introduced during – or at least before – the update from Drupal 8.6 to 8.7 – long before I was responsible for the site. I’m a little uncertain if database updates were skipped at some point – but those continue working during site updates to both modules and core. But “drush entup” back then was clearly not run while it was still available. (And I understand why that’s been deprecated for production sites since 8.7.)

Important points:

  • I’ve attempted both devel_entity_updates and the entity_update modules to see if these would help us in a development environment, but neither will perform the changes due to “The SQL storage cannot change the schema for an existing field with data.” So those modules have not been a solution for us unfortunately.
  • The site has a few alphas, a few betas, and a few RCs installed – not many, but a few. (More than I would have chosen to install for a production site.) In my update attempt to Drupal 9, many of these were uninstalled and pulled out. This did not make these mismatch messages go away.
  • I’m comfortable with massaging the underlying field tables directly in SQL: I did so in a case where I needed only to add two columns just so I could uninstall a module. If I must, I can use this brute-force method on other fields.
  • I’ve read through numerous Stack Exchange questions about this – including how to write update hooks to alter the schema of these fields, and I’m happy to do that if I can determine what actually differs.
  • I know how to roll back and re-run the DB updates for specific modules, but my attempts at doing so (for one or two modules that I thought might be the culprits) didn’t solve the problem. (Yet.) I may try more of this – going further back – if that seems a way to determine where these issues arose.

And that’s my main question:

What is the best way to determine what differs between the definitions and the schema for a given entity or field?

I know how to look at the field definitions themselves using a SQL command similar to this (on a node-based field in this example):

SELECT value FROM key_value WHERE collection='entity.storage_schema.sql' AND name='node.field_schema_data.field_name_of_my_field';

I can then take that value, make it readable (i.e. indent it), and then compare it to the actual field table in the database directly to see what might be different.
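As a possibly easier starting point (a sketch, assuming the Drupal 8.9 entity definition update manager API), you can also ask Drupal itself which definitions it considers out of sync, e.g. from `drush php:eval` or a one-off script:

```php
// Lists the entity/field definition changes Drupal has detected; keys are
// entity type IDs, values describe the mismatched definitions, which can
// narrow down which field tables to compare by hand.
$change_list = \Drupal::entityDefinitionUpdateManager()->getChangeList();
print_r($change_list);
```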

BUT – is this the best way to determine what differs? Am I barking up the wrong tree, or is there something else I’m missing? Or … is this in fact the best way to figure the differences out?

Many thanks for any insight that can be provided. I’d love to know if there’s an easier way before I launch myself down this path in full. (Dealing with troubled data is no fun.) 🙂

ag.algebraic geometry – Definitions of Picard schemes and Picard stacks

Let $X \to S$ be a morphism of schemes. The Picard functor $\mathop{\mathrm{Pic}}_{X/S}$ is defined to be the sheaf associated to the absolute Picard functor $\mathop{\mathrm{Pic}}_X(T) = \mathop{\mathrm{Pic}}(X_T)$ in the fppf topology (or to $\mathop{\mathrm{Pic}}(X_T)/\mathop{\mathrm{Pic}}(T)$). However, in the definition of the Picard stack $\mathop{\mathscr{Pic}}_{X/S}$, see Tag 0372, one neither takes the stackification nor takes the quotient, and the presheaf associated to $\mathop{\mathscr{Pic}}_{X/S}$ is the absolute Picard functor $\mathop{\mathrm{Pic}}_X$, so I do not understand why $\mathop{\mathscr{Pic}}_{X/S}$ is a stack.

And under certain conditions, $\mathop{\mathscr{Pic}}_{X/S}$ is represented by an algebraic stack, see Tag 0D04, and $\mathop{\mathrm{Pic}}_{X/S}$ is represented by a scheme, see Section 9.4 of FGA Explained. I wonder whether the algebraic stack is the same as the scheme.

Different definitions of equivalent norms

I’m trying to show that the following definitions are equivalent.

Two norms over a field $F$ are equivalent if:

  1. there exist two constants $A, B$ such that $A|x|_1 < |x|_2 < B|x|_1$ for every $x \in F$
  2. there exists $c \in \mathbb{R}$ such that $|x|_1^c = |x|_2$

certification – What are the (two?) definitions of SOC?

On one hand it stands for "Security Operations Centre", but SOC is seemingly also used in the reporting and certification domain; where does this come from? Are there other definitions of SOC in information security compliance certification, or do these terms just reference it as a way to describe practices that need to be in place in a given IT division? The question comes from trying to determine what SOC "level" report I should request from a third-party vendor to my organization.

sharepoint online – Remove Role Definitions From SPO Group – Power Automate – MS Flow

I’m trying to remove the Full Control role definition from the Site Owner group, because I have added another permission level through Power Automate/MS Flow.

From the Microsoft docs, below is the syntax for the POST call:


In Power Automate I tried it like this:

It says "resource not found". I believe roledefinition is an object, not the role definition ID, but how can I send the object in Power Automate/Flow?
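One hedged possibility (assuming the goal is removing Full Control from the owners group, and that 1073741829 is the Full Control role definition ID on your site — it can be looked up via /_api/web/roledefinitions): the removeroleassignment REST endpoint takes plain integer IDs, so no role definition object has to be posted from the "Send an HTTP request to SharePoint" action:

```
POST /_api/web/roleassignments/removeroleassignment(principalid=<ownersGroupId>,roledefid=1073741829)
Accept: application/json;odata=verbose
```

Here `<ownersGroupId>` is the integer ID of the Site Owner group, e.g. from /_api/web/sitegroups.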

