Q. & A.s – SCI. & TECH. – 18
476Q. What is Web 1.0?
Web 1.0 was the first iteration of the Web; at the time it was simply called 'the Internet', and only in retrospect Web 1.0. It describes the Web as it was generally used before 1999, a period experts call the Read-Only era. Its main features were hyperlinking and bookmarking of web pages. Sites typically consisted of little more than online guestbooks and framesets, and there was no real flow of communication between the consumer and the producer of information. E-mail was sent through HTML forms. The best examples of Web 1.0 are the static websites built during the dot-com era.
Incredibly, people think this is the first big, huge jump from what we had before - but it is not the first time.
Top ten things that changed long before anybody even knew of "Web 2.0":
1) We went from ARPANET to the Internet.
2) We went from bulletin boards and a protocol called "gopher" to webpages and http.
3) We started using Hypertext Markup Language.
4) We started using XML & CSS instead of plain HTML.
5) Development of TCP/IP.
6) DNS instead of plain IP addresses.
7) Unicode instead of plain DOS text.
8) Email.
9) Instant Messaging.
10) Wireless access.
477Q. What is Web 2.0?
The term Web 2.0 was coined in 1999, and the concept was popularized by O'Reilly Media at a brainstorming session with MediaLive International in 2004. The information available through Web 2.0 empowered a new generation to develop new concepts like wikis, widgets, and video streaming. It also allowed many users to publish their own content in a few basic steps, which was not possible in Web 1.0. Web 2.0 made possible many of the sites we commonly use today, such as Twitter, Flickr, and Facebook.
Web 2.0 can be described in three parts, one of which is the Social Web: it defines how Web 2.0 sites tend to interact much more with the end user and make the end user an integral part of the site. As such, Web 2.0 draws together the capabilities of client- and server-side software, content syndication, and the use of network protocols. Standards-oriented web browsers may use plug-ins and software extensions to handle the content and the user interactions. Key features of Web 2.0 include:
- Folksonomy: Free Classification of Information (a minimal sketch follows this list)
- Rich User Experience
- User as a Contributor
- Long Tail
- User Participation
- Basic Trust
- Dispersion
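To make the folksonomy idea concrete, here is a minimal, hypothetical Python sketch in which users freely tag items and a shared classification emerges from the aggregate; the user, item, and tag names are invented for the example.

```python
from collections import defaultdict

# Each user freely tags items with their own one-word descriptions
# (no pre-made categories) -- hypothetical sample data.
user_tags = [
    ("alice", "photo_123", "sunset"),
    ("bob",   "photo_123", "beach"),
    ("carol", "photo_123", "sunset"),
    ("alice", "video_456", "tutorial"),
]

# The "folksonomy" emerges by aggregating everyone's tags per item.
folksonomy = defaultdict(lambda: defaultdict(int))
for user, item, tag in user_tags:
    folksonomy[item][tag] += 1

for item, tags in folksonomy.items():
    popular = sorted(tags.items(), key=lambda kv: kv[1], reverse=True)
    print(item, popular)   # e.g. photo_123 [('sunset', 2), ('beach', 1)]
```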
Web 2.0 sites provide users with information storage, creation, and dissemination capabilities that were not possible in the environment now known as "Web 1.0".
Web 2.0 websites include the following features and techniques, summarized by Andrew McAfee with the acronym SLATES:
- Search: finding information through keyword search.
- Links: connecting information together into a meaningful information ecosystem using the model of the Web, and providing low-barrier social tools.
- Authoring: the ability to create and update content leads to the collaborative work of many rather than just a few web authors. In wikis, users may extend, undo, and redo each other's work. In blogs, posts and the comments of individuals build up over time.
- Tags: categorization of content by users adding "tags" (short, usually one-word descriptions) to facilitate searching, without dependence on pre-made categories. Collections of tags created by many users within a single system may be referred to as "folksonomies" (i.e., folk taxonomies).
- Extensions: software that makes the Web an application platform as well as a document server, including Adobe Reader, Adobe Flash Player, Microsoft Silverlight, ActiveX, Oracle Java, QuickTime, and Windows Media.
- Signals: the use of syndication technology such as RSS to notify users of content changes (a minimal sketch follows this list).
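As a hedged illustration of the Signals idea, the Python sketch below parses a tiny RSS 2.0 snippet with the standard library and lists the items a subscriber would be notified about; the feed contents are invented for the example.

```python
import xml.etree.ElementTree as ET

# A tiny, hypothetical RSS 2.0 feed of the kind a Web 2.0 site publishes
# so that subscribers are signalled about content changes.
FEED = """
<rss version="2.0">
  <channel>
    <title>Example Blog</title>
    <item><title>New post: Folksonomies</title><pubDate>Mon, 01 Jan 2024 10:00:00 GMT</pubDate></item>
    <item><title>New post: Tagging basics</title><pubDate>Tue, 02 Jan 2024 09:00:00 GMT</pubDate></item>
  </channel>
</rss>
"""

root = ET.fromstring(FEED.strip())
for item in root.iter("item"):
    # Each <item> represents one content change the subscriber is notified of.
    print(item.findtext("title"), "-", item.findtext("pubDate"))
```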
Marketing
For marketers, Web 2.0 offers an opportunity to engage consumers, and smaller businesses can use Web 2.0 marketing strategies to compete with larger companies. As new businesses grow and develop, new technology is used to decrease the gap between businesses and customers.
Networks such as Yelp are now becoming common elements of multichannel and customer loyalty strategies, and banks are beginning to use these sites proactively to spread their messages.
Web 2.0 technologies also provide teachers with new ways to engage students, and even allow student participation on a global level, as Will Richardson discusses in Blogs, Wikis, Podcasts and Other Powerful Web Tools for Classrooms.
Web 2.0 has also been applied to philanthropy, to social work, and to Web-based applications and desktops.
According to Best, the characteristics of Web 2.0 are:
1. Rich user experience,
2. User participation,
3. Dynamic content,
4. Metadata,
5. Web standards, and
6. Scalability.
Web 2.0 applications tend to interact much more with the end user. As such, the end user is not only a user of the application but also a participant, by:
1. Podcasting,
2. Blogging,
3. Tagging, and
4. Curating with RSS.
The popularity of the term Web 2.0, along with the increasing use of blogs, wikis, and social networking technologies, has led many in academia and business to append a flurry of "2.0"s to existing concepts and fields of study, including:
1. Library 2.0,
2. Social Work 2.0,
3. Enterprise 2.0,
4. PR 2.0,
5. Classroom 2.0,
6. Publishing 2.0,
7. Medicine 2.0,
8. Telco 2.0,
9. Travel 2.0,
10. Government 2.0, and even
11. Porn 2.0.
478Q. What are the criticisms of Web 2.0?
The term Web 2.0 was never clearly defined and even today if
one asks ten people what it means one will likely get ten different definitions.
Critics of the term claim that "Web 2.0" does not represent a new
version of the World Wide Web at all, but merely continues to use
so-called "Web 1.0" technologies and concepts. First, techniques such as AJAX do not replace underlying protocols like HTTP, but add an additional layer of abstraction on top of them.
Second, many of the ideas of Web 2.0 were already featured in implementations on networked systems well before the term "Web 2.0" emerged.
Amazon.com, for instance, has allowed users to write reviews and consumer guides since its launch in 1995, in a form of self-publishing.
Amazon also opened its API to outside developers in 2002.
Previous developments also came from research in computer-supported collaborative learning and computer supported cooperative work (CSCW) and from established products like Lotus Notes and Lotus Domino, all phenomena that preceded Web 2.0.
"Nobody really knows what it means...If Web 2.0 for you is blogs and wikis, then that is people to people. But that was what the Web was supposed to be all along... Web 2.0, for some people, it means moving some of the thinking [to the] client side, so making it more immediate, but the idea of the Web as interaction between people is really what the Web is. That was what it was designed to be... a collaborative space where people can interact."
Other critics labeled Web 2.0 "a second bubble" (referring to the Dot-com bubble of circa 1995–2001), suggesting that too many Web 2.0 companies attempt to develop the same product with a lack of business models.
In terms of Web 2.0's social impact, critics such as Andrew Keen argue that Web 2.0 has created a cult of digital narcissism and amateurism, which undermines the notion of expertise by allowing anybody, anywhere to share and place undue value upon their own opinions about any subject and post any kind of content, regardless of their particular talents, knowledge, credentials, biases or possible hidden agendas.
Keen's 2007 book, The Cult of the Amateur, argues that the core assumption of Web 2.0, that all opinions and user-generated content are equally valuable and relevant, is misguided.
Additionally, Sunday Times reviewer John Flintoff has characterized Web 2.0 as "creating an endless digital forest of mediocrity: uninformed political commentary, unseemly home videos, embarrassingly amateurish music, unreadable poems, essays and novels... [and that Wikipedia is full of] mistakes, half truths and misunderstandings".
Michael Gorman, former president of the American Library Association, has been vocal about his opposition to Web 2.0 due to the lack of expertise that it outwardly claims, though he believes that there is hope for the future.
There is also a growing body of critique of Web 2.0 from the perspective of political economy.
Tim O'Reilly and John Battelle have described Web 2.0 as being based on "customers... building your business for you," and critics have argued that sites such as Google, Facebook, YouTube, and Twitter exploit the "free labor" of user-created content.
Web 2.0 sites use Terms of Service agreements to claim perpetual licenses to user-generated content, and they use that content to create profiles of users to sell to marketers. This is part of increased surveillance of user activity happening within Web 2.0 sites.
Jonathan Zittrain of Harvard's Berkman Center for Internet & Society argues that such data can be used by governments who want to monitor dissident citizens.
478Q. What is Web 3.0?
Web 3.0 might be defined as a third generation of the Web enabled by the convergence of several key emerging technology trends:
Ubiquitous Connectivity
- Broadband adoption
- Mobile Internet access
- Mobile devices
Network Computing
- Software-as-a-service business models
- Web services interoperability
- Distributed computing (P2P, grid computing, hosted “cloud computing” server farms such as Amazon S3)
Open Technologies
- Open APIs and protocols
- Open data formats
- Open-source software platforms
- Open data (Creative Commons, Open Data License, etc.)
Open Identity
- Open identity (OpenID)
- Open reputation
- Portable identity and personal data (for example, the ability to port your user account and search history from one service to another)
The Intelligent Web
- Semantic Web technologies (RDF, OWL, SWRL, SPARQL, Semantic application platforms, and statement-based datastores such as triplestores, tuplestores and associative databases)
- Distributed databases, or "The World Wide Database" (wide-area distributed database interoperability enabled by Semantic Web technologies)
- Intelligent applications (natural language processing, machine learning, machine reasoning, autonomous agents)
Conclusion
Web 3.0 will be more connected, open, and intelligent, with Semantic Web technologies, distributed databases, natural language processing, machine learning, machine reasoning, and autonomous agents.
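As a rough, hypothetical illustration of the statement-based datastores (triplestores) mentioned above, the following Python sketch stores subject-predicate-object statements and answers a simple pattern query; the data and names are invented.

```python
# A toy triplestore: every fact is a (subject, predicate, object) statement.
triples = {
    ("alice", "knows", "bob"),
    ("bob", "knows", "carol"),
    ("alice", "livesIn", "Phoenix"),
}

def query(s=None, p=None, o=None):
    """Return all statements matching the pattern; None acts as a wildcard."""
    return [t for t in triples
            if (s is None or t[0] == s)
            and (p is None or t[1] == p)
            and (o is None or t[2] == o)]

print(query(p="knows"))               # who knows whom
print(query(s="alice", p="livesIn"))  # where alice lives
```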
479Q. What is the difference between a generator and a motor?
If electrical energy is given as input, the system that produces mechanical energy is called a motor.
If mechanical energy is given as input, the system that produces electrical energy is called a generator.
480Q. What is the range of the Net?
No single person owns the Internet. No single government has authority over its operations. Some technical rules and hardware/software standards govern how people connect to the Internet, but for the most part, the Internet is a free and open medium of networked hardware.
481Q. What is Supercomputer performance?
A petaFLOPS is a thousand trillion (10^15) floating-point operations per second. For example, 1,000 × a thousand trillion operations per second = 10^18 operations per second, or one exaFLOPS.

Computer performance
| Name | FLOPS |
| yottaFLOPS | 10^24 |
| zettaFLOPS | 10^21 |
| exaFLOPS | 10^18 |
| petaFLOPS | 10^15 |
| teraFLOPS | 10^12 |
| gigaFLOPS | 10^9 |
| megaFLOPS | 10^6 |
| kiloFLOPS | 10^3 |
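As a quick, hedged illustration of these prefixes, the Python sketch below converts a machine's rating into raw operations per second; the 10-petaFLOPS rating is an invented example.

```python
# Prefixes used for supercomputer performance, in floating-point operations per second.
FLOPS_PREFIXES = {
    "kiloFLOPS": 1e3,   "megaFLOPS": 1e6,   "gigaFLOPS": 1e9,
    "teraFLOPS": 1e12,  "petaFLOPS": 1e15,  "exaFLOPS": 1e18,
    "zettaFLOPS": 1e21, "yottaFLOPS": 1e24,
}

def to_flops(value, prefix):
    """Convert a rating such as (10, 'petaFLOPS') to plain operations per second."""
    return value * FLOPS_PREFIXES[prefix]

rate = to_flops(10, "petaFLOPS")                 # hypothetical 10-petaFLOPS machine
print(f"{rate:.0e} operations per second")       # 1e+16
print(f"{rate * 86400:.2e} operations per day")  # 8.64e+20
```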
482Q. How can we expand storage space?
RAID (redundant array of independent disks, originally redundant
array of inexpensive disks) is a storage technology that combines multiple disk drive
components into a logical unit. RAID is now used as an umbrella term for computer data storage schemes that can divide and replicate data among multiple physical drives: RAID is an example of storage virtualization and the array can be accessed by the operating system as one single drive.
The different schemes or architectures are named by the word RAID followed by a number (e.g. RAID 0, RAID 1). Each scheme provides a different balance between the key goals: reliability and availability, performance, and capacity. RAID levels greater than RAID 0 provide protection against unrecoverable (sector) read errors, as well as whole disk failure.
483Q. What Are the Characteristics Of Supercomputers?
Supercomputers are the fastest calculating devices ever invented. A desktop microcomputer processes data and instructions in millionths of a second, or microseconds. A supercomputer, by contrast, can operate at speeds measured in nanoseconds and even in picoseconds, one thousand to one million times as fast as microcomputers.
Examples of supercomputer applications are weather forecasting, oil exploration, weapons research, and large-scale simulation.
The chief difference between a supercomputer and a mainframe is that a supercomputer channels all its power into executing a few programs as fast as possible, whereas a mainframe uses its power to execute many programs concurrently.
For example, the Y-MP/C90 made by Cray Research Inc. can perform 2.1 billion mathematical calculations per second. More powerful supercomputers use a technology called massively parallel processing; these supercomputers consist of thousands of integrated microprocessors. One massively parallel computer built by Intel Corporation is capable of performing 8.6 billion mathematical calculations per second.
Supercomputers have massive parallelism and very high computational speed. Often they have specific hardware to handle large amounts of floating-point operations or vector operations.
Disadvantages are usually cost related: supercomputers are large, expensive to maintain, and require a lot of electrical power.
484Q. What are the types of computers?
1. PC, 2. Desktop, 3. Laptop, 4. Palmtop, 5. Workstation, 6. Server, 7. Mainframe, 8. Minicomputer, 9. Supercomputer, 10. Wearable.
485Q. What is variable speed?
A: Variable-speed
blower motors are designed to provide greater comfort through reduced initial
air velocities and noise.
When the unit first turns on, the
blower operates at low speed, which not only provides less noise than a
single-speed blower, but also allows the compressor and coil to ramp up before
the unit begins moving large volumes of air through the system.
486Q. What are SEER and ton?
A: SEER (Seasonal Energy
Efficiency Ratio) is a measurement of a unit's efficiency, and ton is a
measurement of a unit's size.
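As a rough, hypothetical worked example of how the two measurements relate: one ton of cooling equals 12,000 BTU per hour, and SEER is the season's cooling output in BTU divided by the electricity used in watt-hours, so a unit's seasonal energy use can be estimated as below (the tonnage, SEER value, and run hours are invented).

```python
# Hypothetical unit: 3 tons, SEER 14, running 1,000 hours over a cooling season.
tons, seer, season_hours = 3, 14, 1000

btu_per_hour = tons * 12_000            # 1 ton = 12,000 BTU/h of cooling capacity
season_btu = btu_per_hour * season_hours
season_kwh = season_btu / seer / 1000   # SEER = BTU delivered per watt-hour consumed

print(f"Cooling delivered: {season_btu:,} BTU")
print(f"Estimated electricity used: {season_kwh:,.0f} kWh")
```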
487Q. How does a heat pump work?
A: Heat pumps take heat from
the outside air and move it inside the house. The mild winters in Phoenix
provide plenty of heat in the outside air that you can use. In summer, the heat
pump moves heat from inside the house to the outside, providing efficient
cooling.
488Q. Are the supergiants massive stars?
The masses of the supergiants are about 30 times the mass of the Sun, although the volumes of such stars may be thousands of times the volume of the Sun.
No; as a rule, the supergiant stars all have a low surface temperature - about 3,000°C.
The largest stars are about 3,000 times the diameter of the Sun. In kilometres, this ranges from about 160 million kilometres up to about 5 thousand million kilometres in diameter.
491Q. How does an air conditioner (AC) work?
A: Air
conditioners perform two basic functions: heat removal and moisture removal.
Even in Arizona, we have a monsoon season with higher-than-normal humidity
levels. The lower the humidity level, the more comfortable you will feel at a
given temperature. As your warm indoor air is drawn up through the filter, it
passes over a very cold coil whereby the heat and moisture are removed. If
you've ever noticed a PVC pipe running off your roof that drips water, that is
the moisture removed from your home.
492Q. Is it more economical to operate the fan on my air-conditioning unit continuously, or just turn on ceiling fans in the rooms in use?
A: Unless you are
using an electronic air filter that requires a continuous stream of air, you're
better off setting your unit's fan on "auto" and using ceiling fans
in occupied rooms.
493Q. What are the differences between a supercomputer and a mainframe?
MAINFRAMES
- Mainframes are large computers with great processing speed and storage capabilities.
- Mainframes are used for problems which are limited by data movement in input/output devices.
- Mainframes are designed to handle very high-volume input and output.
- Mainframes are measured in millions of instructions per second (MIPS).
- A mainframe deals with storing large amounts of data.
- Insurance and payroll processing applications are well suited to mainframes.
- Mainframes typically form part of a manufacturer's standard model lineup.
- A mainframe is concerned with computing over a large amount of data.

SUPERCOMPUTERS
- Introduced in the 1960s, supercomputers are the computers with the fastest processing power.
- Supercomputers are sophisticated and expensive computers capable of processing trillions of instructions in a second.
- Supercomputers are used for scientific and engineering problems which are limited by processing speed and memory size.
- Supercomputers have multiple processing units, making their speed unimaginably fast.
- Supercomputers are measured in floating-point operations per second (FLOPS), up to the order of 10^18.
- A supercomputer is concerned with performing calculations at very high speed.
- Weather forecasting is suited to supercomputers.
- Most supercomputers are one-off designs.
- In short, the supercomputer is concerned much more with speed.
494Q. What are RAID numbers?
RAID stands for Redundant Array of Independent Disks.
RAID 0 = block-level striping without parity or mirroring
RAID 1 = mirroring without parity or striping
RAID 2 = bit-level striping with dedicated Hamming-code parity
RAID 3 = byte-level striping with dedicated parity
RAID 4 = block-level striping with dedicated parity
RAID 5 = block-level striping with distributed parity
RAID 6 = block-level striping with double distributed parity
RAID 10 (often referred to as RAID 1+0) = mirroring and striping combined
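As a hedged illustration of how these levels trade capacity for redundancy, the Python sketch below estimates usable capacity for an array of identical drives; the six-drive, 4 TB configuration is an invented example and real arrays vary.

```python
def usable_capacity_tb(level, n_drives, drive_tb):
    """Rough usable capacity for n identical drives at common RAID levels."""
    if level == 0:                    # striping only, no redundancy
        return n_drives * drive_tb
    if level == 1:                    # mirroring: capacity of a single drive
        return drive_tb
    if level == 5:                    # striping plus one drive's worth of parity
        return (n_drives - 1) * drive_tb
    if level == 6:                    # striping plus two drives' worth of parity
        return (n_drives - 2) * drive_tb
    if level == 10:                   # striped mirrors (drives used in pairs)
        return (n_drives // 2) * drive_tb
    raise ValueError("RAID level not covered by this sketch")

# Hypothetical array: six 4 TB drives.
for level in (0, 1, 5, 6, 10):
    print(f"RAID {level}: {usable_capacity_tb(level, 6, 4)} TB usable")
```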
495Q. Software-based RAID
Software RAID implementations are now provided by
many operating systems. Software RAID can be
implemented as:
- A layer that abstracts multiple devices, thereby providing a single virtual device (e.g. Linux's md)
- A more generic logical volume manager (provided with most server-class operating systems, e.g. Veritas or LVM)
- A component of the file system (e.g. ZFS or Btrfs)
496Q. Non-RAID drive architectures
Non-RAID drive architectures also
exist, and are often referred to, similarly to RAID, by standard acronyms,
several tongue-in-cheek. A single drive is referred to as a SLED (Single Large
Expensive Disk/Drive), by contrast with RAID, while an array of drives without
any additional control (accessed simply as independent drives) is referred to,
even in a formal context such as equipment specification, as a JBOD (Just a
Bunch Of Disks). Simple concatenation is referred to as a "span".
497Q. What are THz rays?
In physics, terahertz radiation refers to electromagnetic waves at frequencies in the terahertz range.
The term is normally used for the region of the electromagnetic spectrum between 300 gigahertz (3×10^11 Hz) and 3 terahertz (3×10^12 Hz), corresponding to the submillimeter wavelength range between 1 millimeter (the high-frequency edge of the microwave band) and 100 micrometers (the long-wavelength edge of far-infrared light).
Like infrared radiation or microwaves, these waves usually travel in line of sight.
Sources of terahertz radiation include:
- the gyrotron,
- the backward wave oscillator ("BWO"),
- the far infrared laser ("FIR laser"),
- quantum cascade laser,
- the free electron laser (FEL),
- synchrotron light sources,
- photomixing sources, and
- single-cycle sources used in Terahertz time domain spectroscopy such as photoconductive, surface field, Photo-dember and optical rectification emitters.
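The band edges quoted above follow directly from the relation wavelength = c / frequency; a small Python check (purely illustrative):

```python
c = 299_792_458  # speed of light in m/s

for label, freq_hz in [("300 GHz", 300e9), ("3 THz", 3e12)]:
    wavelength_mm = c / freq_hz * 1000
    print(f"{label}: wavelength is about {wavelength_mm:.2f} mm")
# 300 GHz -> ~1.00 mm (microwave edge); 3 THz -> ~0.10 mm = 100 micrometers (far-infrared edge)
```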
498Q. What is the speciality of XML?
Web content
began as static HTML pages and evolved to include client-side scripting, proprietary content technologies, and application
programming interfaces.
HTML has
remained the basis of all Web content - until now. We are about to witness the revolutionary
move of content from HTML to XML (Extensible Markup Language).
XML is a set of
rules for defining a document using tags in a self-described vendor- and platform-neutral
manner.
XML has
numerous advantages over HTML. It is easily transformable and can describe any
type of content.
HTML is a
rendered presentation of data for a specific set of clients (namely HTML-based
browsers), while XML can be data, its presentation, or a combination of both.
Metaphorically
speaking, HTML is a picture of a 3D object (Data, Presentation, and Flow Logic)
while XML is the 3D object itself.
There has been
no equivalent to non-proprietary HTTP until recently, with the development of
XML.
Audio, data,
and video content can be described by metadata in XML.
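As a minimal, hypothetical illustration of XML as self-describing data, the Python sketch below parses a small metadata document of the kind the answer describes for video content; the element names and values are invented.

```python
import xml.etree.ElementTree as ET

# Hypothetical, self-describing metadata for a video clip. Unlike HTML,
# the tags describe the data itself rather than how to present it.
DOC = """
<video>
  <title>Desert Monsoon</title>
  <durationSeconds>95</durationSeconds>
  <format>mp4</format>
</video>
"""

clip = ET.fromstring(DOC.strip())
print(clip.findtext("title"))                 # Desert Monsoon
print(int(clip.findtext("durationSeconds")))  # 95
```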
499Q. What is three-tier architecture in XML?
Three-tier
architecture has been the prevailing design for Internet systems during the
past few years. In this design, there are three primary components:
- Database,
- Application server, and
- Client.
Three-tier
architecture was considered an evolutionary step over the client-server model.
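A very rough Python sketch of the three tiers; all names are invented, and a real system would use an actual database and a network boundary between the tiers.

```python
# Tier 1: database -- here just an in-memory table standing in for a real DBMS.
ORDERS = {1: {"item": "router", "status": "shipped"}}

# Tier 2: application server -- business logic that mediates access to the data.
def get_order_status(order_id):
    order = ORDERS.get(order_id)
    return order["status"] if order else "unknown"

# Tier 3: client -- presentation layer that calls the application server.
if __name__ == "__main__":
    print("Order 1 status:", get_order_status(1))
```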
500Q. What is "Bleeding Edge" technology?
Bleeding edge is
a term that refers to technology
that is so new (and thus, presumably, not perfected) that the user is required
to risk reductions in stability and productivity in order to use it. It also
refers to the tendency of the latest technology to be extremely expensive.
A technology may be considered
bleeding edge under the following conditions:
- Lack of consensus — competing ways of
doing some new thing exist and no one really knows for certain which way
the market is going to go.
- Lack of knowledge — organizations are
trying to implement a new technology or product that the trade journals
have not even started talking about yet, either for or against.
- Industry resistance to change
— trade journals and industry leaders have spoken against a new technology
or product but some organizations are trying to implement it anyway
because they are convinced it is technically superior.