wireless networking – Windows 10 machine periodically shows Wi-Fi as “no internet available”, even though other devices work fine. How to debug?

I’m using a brand-new Dell laptop running Windows 10, connected to a Wi-Fi network. Periodically the machine flags the Wi-Fi connection as “no Internet available”, at which point I cannot even log into the router’s admin page. Reconnecting to the Wi-Fi usually restores connectivity. All other devices on the network work fine.

How do I debug this issue to figure out what’s wrong?
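One piece of the triage can be scripted: when the drop happens, check whether the laptop can still reach the router versus the wider Internet. A minimal Python sketch (192.168.1.1 is a placeholder gateway address; substitute your router’s):

```python
import socket

def can_connect(host, port, timeout=2.0):
    """Return True if a TCP connection to (host, port) succeeds within timeout."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

# If the router check fails too, the fault is local (driver, power saving,
# roaming) rather than the ISP -- consistent with not reaching the admin page.
print("router reachable:  ", can_connect("192.168.1.1", 80))   # placeholder gateway
print("internet reachable:", can_connect("8.8.8.8", 53))       # public DNS server
```

Running this on a schedule would give a timeline of when connectivity actually drops and which hop fails first.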

machine learning – What does $n$ mean in a neural network equation?

I’ve found this equation describing the output of a neuron in an MLP network:

$y(n) = f(\mathbf{w}^T \mathbf{x}(n) + b)$

I can understand the general context, but since I have no background in mathematical notation, I don’t understand what the $(n)$ argument means (e.g. $y(n)$, $\mathbf{x}(n)$). Is it some sort of temporal or sample index? I’ve seen this notation in other machine learning material, but haven’t found an explanation.
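To make the sample-index reading concrete, here is a minimal Python sketch (the weights, bias, and activation are made up): $\mathbf{x}(n)$ is simply the $n$-th input vector and $y(n)$ the corresponding scalar output.

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.random((5, 3))           # five samples x(0)..x(4), three features each
w = np.array([0.2, -0.5, 0.1])   # weight vector (hypothetical values)
b = 0.3                          # bias (hypothetical)

def f(v):
    # activation function, here a sigmoid
    return 1.0 / (1.0 + np.exp(-v))

n = 2                            # pick the third sample
y_n = f(w @ X[n] + b)            # y(n) = f(w^T x(n) + b)
print(y_n)
```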

Thanks in advance,

machine learning – How to use Classify to separate lines and circles

I generated lines and circles:

lines = Table[
    Graphics[
     Line[{{RandomInteger[{0, 10}], RandomInteger[{0, 10}]},
       {RandomInteger[{0, 10}], RandomInteger[{0, 10}]}}],
     ImageSize -> 10], {x, 1, 20}];
circles = Table[
    Graphics[
     Circle[{RandomInteger[{0, 10}], RandomInteger[{0, 10}]},
      RandomInteger[{0, 20}]],
     ImageSize -> 10], {x, 1, 20}];

and put them into a classifier

c = Classify[{lines -> "lines", circles -> "circles"}]

The training was successful with no errors, but when I try to test the classifier with:

test = Graphics[Line[{{0, 1}, {0, 2}}], ImageSize -> 10]

I get the error:

ClassifierFunction::mlbddataev: The data being evaluated is not
formatted correctly.

And I do not understand what the problem is. Can somebody tell me how to format the data correctly?

machine learning – How to get a heatmap of pixel influences from a Convolutional Neural Network

Suppose I have a CNN model (let’s say I’m using TensorFlow and Keras) that I’ve trained for a particular task, for example to detect the difference between an apple and a banana. For a given input image, is there a way to generate a heatmap, overlaid on the original image, that indicates which parts of that image influenced the network’s decision?

I’m guessing that you’d have to use the chain rule to get the partial derivative of the loss function with respect to each pixel, and then normalize those values and plot them as a heatmap. Is there a library or existing method that does this?

How would I do this if I had a CNN mapping to predictions for more than two classes?
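For what it’s worth, what I’m describing seems to be the “saliency map” idea (gradient of a class score with respect to the input pixels); Grad-CAM is a popular refinement, and libraries such as tf-keras-vis implement both for Keras models. Below is a toy sketch of the plain-gradient version, with the trained CNN replaced by a linear classifier on flattened pixels so the gradient is analytic and the example stays runnable; with a real model you would obtain the same gradient from tf.GradientTape.

```python
import numpy as np

rng = np.random.default_rng(0)
height, width, n_classes = 8, 8, 3                       # any number of classes works
weights = rng.normal(size=(n_classes, height * width))   # hypothetical trained weights
bias = rng.normal(size=n_classes)

def class_scores(img):
    # stand-in for model(img): one score per class
    return weights @ img.ravel() + bias

def saliency(target_class):
    # d(score_target)/d(pixel); for this linear stand-in the gradient is
    # just the corresponding weight row. |gradient|, normalized to [0, 1],
    # is the heatmap to overlay on the image.
    grad = weights[target_class].reshape(height, width)
    heat = np.abs(grad)
    return (heat - heat.min()) / (heat.max() - heat.min())

img = rng.random(height * width)
pred = int(np.argmax(class_scores(img)))   # multi-class: pick any target class
heat = saliency(pred)
print(heat.shape)
```

The multi-class case is handled the same way: you take the gradient of whichever class score you care about, usually the predicted one.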

macos – How to delete a file from Time Machine backups in Big Sur?

I can’t seem to take a screenshot of Time Machine, but I’ll describe it. I go to the Time Machine menu and select “Enter Time Machine”. Then I see the view with the stack of windows. I browse to a file and select it. Now I want to delete it so that it no longer exists in, or takes up space in, any backup. There is no option to delete it, neither in the action menu in the toolbar nor in the control-click context menu.

How do I delete a file from Time Machine backups? There used to be a way to do this.

machine learning – Is deepfake detection viable?

I’m thinking of doing a project on deepfake detection, but I’m not entirely sure it is viable. As I understand it, deepfake generation programs have a generative and a discriminative network, and after training the system reaches an equilibrium in which the discriminative network can no longer distinguish real faces from fake ones. I was thinking of building a CNN-LSTM architecture that analyzes not only single image frames but also sequences of frames over time, to better discern real videos from deepfakes, but I’m not sure whether this is viable. Any help or resources would be appreciated.

Complexity of the twin support vector machine for multi-label learning

I’m not familiar with computing the complexity of an algorithm, and I was hoping you could help me find the complexity of the MLTSVM algorithm described at http://www.optimal-group.org/Resource/MLTSVM.html. I know that the twin SVM’s complexity is $\mathcal{O}(m^3/4)$, compared to $\mathcal{O}(m^3)$ for the standard SVM, where $m$ is the number of samples in the dataset; however, I fail to determine the complexity of MLTSVM.
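For reference, my understanding of where the twin-SVM figure comes from: a QP over $m$ samples costs $\mathcal{O}(m^3)$, and the twin SVM instead solves two QPs over roughly $m/2$ samples each, so

$$2 \cdot \mathcal{O}\!\left(\left(\frac{m}{2}\right)^{3}\right) = \mathcal{O}\!\left(\frac{m^{3}}{4}\right).$$

I assume (but have not verified against the paper) that MLTSVM repeats such problems per label, which would scale this by the number of labels.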

Thank you.

How can machine learning be used to make sure a document has all the required headings and details?

We are trying to build a system that would accept FYP proposal documents and validate whether anything is missing, such as a heading or a chart that the template says should be in the document.

The question is how machine learning can be used to solve this problem, since it seems like a simple if-else sort of task.
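For comparison, the purely rule-based version of the template check is indeed short; a minimal Python sketch (the heading names are made up):

```python
# Required headings according to a hypothetical template.
REQUIRED_HEADINGS = ["Abstract", "Introduction", "Methodology", "Timeline"]

def missing_headings(document_text):
    """Return the required headings that never appear as a line of the text."""
    lines = {line.strip().lower() for line in document_text.splitlines()}
    return [h for h in REQUIRED_HEADINGS if h.lower() not in lines]

doc = "Abstract\n...\nIntroduction\n...\nTimeline\n..."
print(missing_headings(doc))  # -> ['Methodology']
```

Machine learning would presumably only enter where such rules get brittle, e.g. recognizing that “Method” and “Methodology” name the same section, or detecting whether a chart is actually present in a scanned page.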

Need Advice: Machine Learning Algorithm for Product Recommendation

[Image: a table of products with their price, units sold, shipping price, and location]

I want to recommend to a buyer which product should be bought, based on price, units sold, shipping price, and location. If my location is C (sorry, it should say “C”, not “B”), which product is optimal for me in terms of price, units sold, shipping price, and location?

Since I’m new to machine learning, which machine learning algorithm would be good to implement?
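To make the question concrete, here is a non-ML weighted-score baseline over made-up product data (all numbers and weights are hypothetical); I’d want whatever learning algorithm I pick to beat something like this:

```python
# Each product: price, units sold, shipping price, and distance from the
# buyer's location (location C). All values are hypothetical.
products = [
    {"name": "P1", "price": 10.0, "sold": 500, "shipping": 2.0, "distance": 3},
    {"name": "P2", "price": 9.0,  "sold": 120, "shipping": 5.0, "distance": 8},
    {"name": "P3", "price": 12.0, "sold": 900, "shipping": 1.0, "distance": 1},
]

def score(p, w_sold=0.01, w_price=1.0, w_ship=1.0, w_dist=0.5):
    # More sales is better; lower price, shipping, and distance are better.
    return (w_sold * p["sold"] - w_price * p["price"]
            - w_ship * p["shipping"] - w_dist * p["distance"])

best = max(products, key=score)
print(best["name"])  # -> P3
```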

Thank you.

lineageos – Is it possible to virtualize Android on an x86 machine?

Following this question, I had a little bit of success installing an x86 port of Android (i.e., BlissOS) on macOS with QEMU. Now I am wondering whether I can virtualize a “normal” Android build (preferably LineageOS) on any of the conventional FLOSS hypervisors such as VirtualBox or QEMU (preferably something portable to Windows).

There are qemu-system-arm and qemu-system-aarch64 versions of QEMU that I expect could do the job. Over here, Alex Bennée does some magic with the so-called “ranchu” kernel that I can’t really understand or trust. Meanwhile, this post says that upstream QEMU hasn’t inherited the graphics acceleration back from Google’s Android Studio emulator.

Now my questions are:

  1. What is that “ranchu” kernel, and how much can it be trusted?
  2. Can I virtualize LineageOS or any other well-known FLOSS Android build? How?