c# – how to reduce the response time of a Web API?

I'm a C# beginner developing a CRUD app with Angular and a Web API (Entity Framework Core), but the response time to list the data from my ServersController in my table is very high. Currently it takes a very long time to list just 48 records (over 5 minutes, and sometimes it doesn't load at all). Talking to a colleague, he suggested querying the database directly in my Get method and consuming the query right there. The big question is: how can I consume this query? The database connection/instance I did manage to create. Here is my current code:

  public async Task<ActionResult<IEnumerable<Server>>> GetServer()
  {
      // My attempted raw SQL query – I don't know yet how to consume its results:
      using (var context = new ITControlContext())
      using (var command = context.Database.GetDbConnection().CreateCommand())
      {
          command.CommandText = "SELECT e.Name AS Environment, t.Name AS ServerType, p.Name AS Plant FROM [ITControl].[dbo].[Server] s INNER JOIN [ITControl].[dbo].[Environment] e ON s.IdEnvironment = e.IdEnvironment INNER JOIN [ITControl].[dbo].[ServerType] t ON s.IdServerType = t.IdServerType INNER JOIN [ITControl].[dbo].[Plant] p ON s.IdPlant = p.IdPlant";
          var dataTable = new System.Data.DataTable();
          // ...how do I fill the DataTable and return its rows from here?
      }

      // Current (slow) implementation:
      return await _context.Server.Include(i => i.IdEnvironmentNavigation).Include(i => i.IdServerTypeNavigation).Include(i => i.IdPlantNavigation).ToListAsync();
  }

What I tried was to create a "new instance" of my database and run the query there (I don't really know what I did), and I would like to return the data from that query so I can list it in my table without such a long delay. Does anyone have an idea of what I can do, or can point me in the right direction on what to research/study?

PS: I imagine I have to change something in my return.



migration – Does having your domain and web host at the same company reduce DNS propagation times?

Propagation times are controlled by the time to live (TTL) that is set on the DNS record. The only way to get DNS changes live faster is to have a shorter TTL specified.

GoDaddy may be offering to set a short TTL for you, but there is no reason that a short TTL can’t be specified at most DNS hosts.

Default TTL values vary widely between DNS hosts. Some may use 30 minute TTLs, but most set 12 hour, 24 hour, 36 hour, or even 48 hour TTLs.

Short TTLs may not even be honored by many caching DNS servers; they may cache records longer than indicated when they deem TTL values too small. Regardless of what your TTL is set to, you should plan to keep your old server running for three days after you switch the DNS records. 95% of your traffic should move over within the TTL expiry, but a small amount may hit the old server for days.

web application – CDN vs server-level GeoIP

Suppose a website is set up so that only IPs from country X are allowed read/write access. This server goes through a CDN such as Cloudflare.

Because the GeoIP read block is at server level, it never actually works: Cloudflare will always fetch and cache the content from an IP in country X and then serve it worldwide, so anyone outside of country X will still be able to read the version that was cached by Cloudflare locally.

What vulnerabilities remain open with this setup on the write side?

web part – Managed Property not appearing in webpart

I have created a list in a SharePoint Communication site. The list is populated and has several columns describing the information. The columns appear as crawled properties (ows_XXXXX). I went into Site Settings > Site Collection Administration > Search Schema and mapped the crawled properties to RefinableStringXX. I have reindexed the site as well as the list. However, when I use the SPFx Modern Search web part, I cannot see the RefinableStrings. If I use a standard web part, I cannot see them either. I have also manually typed the name of the RefinableString into the web part and nothing shows up. It's been over a week since I did the reindexing. Is there something I am missing?

cookies – Is it necessary to encrypt a JSON Web Token more than what is built-in?

It depends on the application and infrastructure the application is running on.

The benefit of JWT is that it allows session-less continuous authentication through digital signing (a public/private key scheme), and the token can be passed between hosts on behalf of the client. The hosts can authenticate the user using the public key in their possession. They know that the request is valid and comes from an authenticated source without maintaining any session or handshake operation (beyond the regular SSL securing the transmission). Cookies were originally designed as session-based tools; JWT is their sessionless (and, importantly, digitally signed) counterpart.

The concern you mentioned is a real one. If the target endpoint using this JWT does not validate the signature, then a man-in-the-middle (MITM) can modify the content of the token. The MITM can be a malicious actor on a host participating in the application (one that receives and passes tokens around as part of the application; they have visibility into the token because the host is an endpoint of an SSL connection). It can also be any actor along the public internet (by terminating SSL in the middle they gain visibility, but this requires a valid CA root cert trusted by client and server – hard to accomplish; think of state actors). The former is a more realistic threat than the latter.

What can they accomplish by modifying the content? It depends on the application. If the content includes an email address and the downstream apps act based on that email address, a malicious actor can replace an ordinary user's email with an admin email address and convince the app that the request is coming from an admin. This all depends on how the application works and how much of the downstream app's internals the malicious actor can figure out.
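The email-swap attack above only works when the verifier skips signature validation. A minimal HS256 sketch (illustrative only; all names here are made up for the example, and a real system should use a vetted JWT library) shows how verification catches a forged payload:

```typescript
import { createHmac, timingSafeEqual } from "node:crypto";

// base64url without padding, as used by JWT.
const b64url = (buf: Buffer): string =>
  buf.toString("base64").replace(/\+/g, "-").replace(/\//g, "_").replace(/=+$/, "");

// Sign a payload with HMAC-SHA256 (the "HS256" JWT algorithm).
function sign(payload: object, secret: string): string {
  const header = b64url(Buffer.from(JSON.stringify({ alg: "HS256", typ: "JWT" })));
  const body = b64url(Buffer.from(JSON.stringify(payload)));
  const sig = b64url(createHmac("sha256", secret).update(`${header}.${body}`).digest());
  return `${header}.${body}.${sig}`;
}

// Recompute the signature over header.body and compare in constant time.
function verify(token: string, secret: string): boolean {
  const [header, body, sig] = token.split(".");
  const expected = b64url(createHmac("sha256", secret).update(`${header}.${body}`).digest());
  return sig.length === expected.length &&
    timingSafeEqual(Buffer.from(sig), Buffer.from(expected));
}

// Swapping the payload for an "admin" one invalidates the signature:
const token = sign({ email: "user@example.com" }, "s3cret");
const [h, , s] = token.split(".");
const forged = `${h}.${b64url(Buffer.from(JSON.stringify({ email: "admin@example.com" })))}.${s}`;
console.log(verify(token, "s3cret"), verify(forged, "s3cret")); // true false
```

The forged token is structurally valid JSON and will happily decode, which is exactly why decoding without verifying is the dangerous shortcut.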

They can also steal the token from one app and use it at another, replaying it with new content suitable for that app.

Combine this with tokens set to live long (like mobile app tokens), and the malicious actor could have a good run with it for quite a while.

Another risk is data theft. This is not a problem if you are not putting anything inside the JWT and use it purely for authentication, but people can start using it like a cookie and pass data through the JWT instead of updating their API, for backwards-compatibility reasons and the like.

A side benefit is that encrypting the content allows you to invalidate the tokens for a specific app without invalidating tokens for all the apps sharing the same identity provider. Once you replace the certificate for an app, all existing tokens become invalid, but only for that specific app (mitigating a major drawback of JWT with long-lived tokens). This can be desirable when using third-party identity providers. It also allows different certificate key rotation schedules to be defined for different apps. Plus, the keys could be per user, per department, etc., allowing JWTs to be invalidated in a more granular fashion.

So, from these perspectives it is understandable that a large corporation like a bank would want to make sure the content is encrypted, to keep it tamper-safe, unreadable, and easy to invalidate.

postgresql – Caching expensive SQL reports in web app when the data changes

I have a web application with an express/node backend using typeorm and PostgreSQL. The home page in my app is a query with lots of inner joins that shows the user a complex report. This query takes about 30 seconds to run which is a bad experience.

I could easily add caching with a TTL value, but that has two problems. First, the report could be out of date if the user hits the cache after updating data. Second, the first page load after the TTL expires will be slow.

Since the report only changes when more records are added to the database I could use the number of records as a key to tell me whether the cache value is out of date or not, solving the first problem. And then I could have a queued process that updates the cache in the background any time the number of records in the database changes, solving the second problem.

The only thing is, I don’t know if any third party libraries exist that already do this or if I’m somehow reinventing existing functionality. Does this strategy have a name?
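The strategy described above can be sketched in a few lines. This is a minimal in-memory version with stand-in functions (the names and the array-backed "database" are invented for the example; in the real app these would be a fast COUNT(*) and the slow report query):

```typescript
type Report = { rows: number; generatedAt: number };

// The cache entry remembers which record count it was computed for.
let cached: { key: number; report: Report } | null = null;

// Stand-in for a cheap COUNT(*) query.
async function countRecords(db: number[]): Promise<number> {
  return db.length;
}

// Stand-in for the expensive multi-join report query.
async function runExpensiveReport(db: number[]): Promise<Report> {
  return { rows: db.length, generatedAt: Date.now() };
}

async function getReport(db: number[]): Promise<Report> {
  const key = await countRecords(db);                      // cheap invalidation check
  if (cached && cached.key === key) return cached.report;  // cache hit: count unchanged
  const report = await runExpensiveReport(db);             // cache miss: recompute
  cached = { key, report };
  return report;
}
```

Using the row count as the cache key like this is a form of key-based cache expiration; the background-refresh half of your idea is essentially a refresh-ahead (or stale-while-revalidate) policy, which with PostgreSQL could be triggered from LISTEN/NOTIFY on inserts instead of polling the count.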

dnd 5e – Does casting Web interrupt the spell Sanctuary?

The rules say

If the warded creature (…) casts a spell that affects an enemy creature, (…) this spell ends.

Nothing specifies that the spell must affect creatures at the moment it is cast. So, you cast Web on an area with an enemy Orc in there. The Orc wasn’t affected yet (it’s not its turn), so Sanctuary is up. Then a few turns go by during which your allies or other enemies move and act, and Sanctuary is up. Then it’s the Orc’s turn, at the beginning of which he must make a save against the Web spell, and Sanctuary ends at that point.

The idea of Sanctuary (it’s even been errata’d to better convey it) is to ensure that you are a sort-of peacekeeper. Enemies have trouble attacking you, but you also cannot be harmful to them.

architecture – Where should I place more complex business logic related to what a Web API does but consumed by a single client?

I have been debating with my colleagues about the following and we still did not reach consensus.

The architecture is as follows:

  • medium size monolith application for intranet usage

  • a small application that features a SPA and a Web API which allows some folks to fill in some tests (Internet)

  • both applications use the same database and our team handles both

  • the flow is the following: someone initializes a test in the internal app and it becomes available in the external app, the external app deals with issuing the test and computing the score which becomes available for the internal app.

Now, the request is to allow the internal app user to preview the test. This is where the debate arises about where to place the business logic: in the monolith vs. in the Web API.

Arguments for the monolith

  • knows best what to display and that is why it should request lists of DTOs (sections, questions, possible answers etc.), aggregate and sort + display the data
  • Web API should behave as “REST as possible” for external clients and only provide simple endpoints (e.g. get a list of entities and children in this case).

Arguments for the Web API

  • Web API already handles the test display, so it already has the data models and business logic to display them to the user. Aggregation, sorting, and caching are already performed there and can be almost entirely reused to construct the monolith-specific DTOs. In short, the tests domain lives in the Web API, so any business logic using those entities should be placed there unless there is a strong argument to do otherwise

  • stop developing in a monolith and develop in a lightweight service instead

  • avoid duplicating almost the same business logic in two places

I am interested in what is the recommended way to proceed in such a case.

Question: Where should I place more complex business logic related to what a Web API does but consumed by a single client?

Extra details required through comments

There is no migration towards microservices, and the monolith will not be rewritten any time soon. However, some functionality, such as the described Web API, must be separate since it will be deployed differently (e.g. in the DMZ, as opposed to the intranet for the monolith).
