Distributed Caching
Sunday, October 20, 2013
Monday, June 24, 2013
Advantages of Distributed Caching
In general, caching improves the performance of an application by storing frequently used information in cache memory, which is much faster to access than the underlying data store. Simple caching techniques work well for smaller applications that run on a single server; however, for larger applications that cannot run on a single server, these simple techniques are no longer sufficient.
Larger applications require multiple servers running in parallel to handle the huge volume of requests, which is why Distributed Caching is needed for large applications. The following are the advantages of using Distributed Caching.
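To make the idea concrete, here is a minimal Python sketch, not tied to any particular caching product, of how a distributed cache can spread entries across several cache servers. The CacheHost and DistributedCache classes and the hashing scheme are illustrative assumptions, not a real API.

import hashlib

class CacheHost:
    """One cache server, simulated here with an in-memory dictionary."""
    def __init__(self, name):
        self.name = name
        self.store = {}

    def get(self, key):
        return self.store.get(key)

    def set(self, key, value):
        self.store[key] = value

class DistributedCache:
    """Routes each key to one of several cache hosts by hashing the key."""
    def __init__(self, hosts):
        self.hosts = hosts

    def _host_for(self, key):
        digest = hashlib.md5(key.encode("utf-8")).hexdigest()
        return self.hosts[int(digest, 16) % len(self.hosts)]

    def get(self, key):
        return self._host_for(key).get(key)

    def set(self, key, value):
        self._host_for(key).set(key, value)

# Every web server in the farm uses the same hashing rule and the same set
# of cache hosts, so they all see one logical cache.
cache = DistributedCache([CacheHost("cache-1"), CacheHost("cache-2"), CacheHost("cache-3")])
cache.set("product:42", {"name": "Widget", "price": 9.99})
print(cache.get("product:42"))

Because every web server applies the same hashing rule against the same set of cache hosts, they all see one logical cache even though the entries physically live on different machines, and adding a host increases the total capacity.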
Distributed Cache
Data is an important part of any application. In a conventional web application architecture that does not use any cache, data can be stored in two ways: in the database or in sessions. Both of these options have their own limitations.
The database can store any amount of data, but fetching large amounts of data from the database and displaying it in the UI brings down the performance of the application. Sessions are quick, but there is a limit to how much data can be stored in a session; adding more data to the session consumes more memory and therefore degrades application performance.
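The usual way a cache fits between these two options is the cache-aside pattern: look in the cache first, and go to the database only on a miss. The Python sketch below illustrates this; the customers table, the get_customer function, and the dictionary standing in for a shared cache are assumptions made up for the example.

import sqlite3

# In a real server farm this would be a shared, distributed cache; a plain
# dictionary stands in for it here so the example stays self-contained.
cache = {}

def get_customer(customer_id, db_path="app.db"):
    key = f"customer:{customer_id}"
    cached = cache.get(key)
    if cached is not None:
        return cached                      # served from cache, no database round trip
    conn = sqlite3.connect(db_path)
    try:
        row = conn.execute(
            "SELECT id, name FROM customers WHERE id = ?", (customer_id,)
        ).fetchone()
    finally:
        conn.close()
    if row is not None:
        cache[key] = row                   # populate the cache for later requests
    return row

With this pattern the database is only hit for data that is not already cached, and nothing large needs to be kept in the user's session.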
Thursday, August 23, 2012
SharePoint 2013 Distributed Cache Service
SharePoint 2013 provides a Distributed Cache service which stores the user's login tokens in a centralized cache that can be used by any of the front-end web servers in a load-balanced environment.
In SharePoint 2010, the login token is stored on a specific front-end server. In a load-balanced setup, if a user's subsequent request hits a different front-end server, the user has to be re-authenticated, because the token from the previous request is held on another server. This caused delays and repeated re-authentication whenever a user's request reached a different front-end server behind the load balancer.
A server running the Distributed Cache service is known as a cache host. Multiple servers can run the Distributed Cache service, and together these servers form a cache cluster. The cache cluster acts as a single unit, and the total cache size is the sum of the memory allocated to the Distributed Cache service on each of the cache hosts.
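The host names and memory figures below are invented for illustration, but this short Python sketch mirrors the two points above: the cluster's capacity is simply the sum of the per-host allocations, and a login token cached by one front-end server is visible to all the others.

# Hypothetical cache hosts and the memory (MB) allocated to the
# Distributed Cache service on each of them.
cache_hosts = {"wfe1": 8192, "wfe2": 8192, "wfe3": 4096}

# The cluster acts as a single unit: its size is the sum of the allocations.
total_cache_mb = sum(cache_hosts.values())
print(f"Cache cluster size: {total_cache_mb} MB")   # 20480 MB

# Because the token cache is centralized, a token written while the user's
# request was served by one front end can be read when a later request
# lands on any other front end, so no re-authentication is needed.
token_cache = {}

def on_login(user, token):
    token_cache[f"token:{user}"] = token            # written by whichever front end handled the login

def is_authenticated(user):
    return f"token:{user}" in token_cache           # readable by every front end

on_login("alice", "abc123")
print(is_authenticated("alice"))                    # True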
The Distributed Cache service also helps enable microblogs and feeds across all users: it caches the blog messages and comments, enabling faster sharing of information among users.
Related Posts
SharePoint 2013 Hardware and Software Requirements
SharePoint 2013 Authentication
SharePoint 2013 Sites
SharePoint 2013 Communities
SharePoint 2013 Content
SharePoint 2013 Search
SharePoint 2013 Insight
SharePoint 2013 Composites
SharePoint 2013 Claim Based Authentication
SharePoint 2013 Support for OAuth 2.0
SharePoint 2013 Distributed Cache Service