they are useful for our purposes because they offer information about the inner
workings of our target application.
We will be focusing on Firefox since it is the default browser in Kali Linux. However, most
browsers include similar tools.
8.3.1 Debugging Page Content
A good place to start our web application information mapping is with a URL address. File
extensions, which are sometimes part of a URL, can reveal the programming language the
application was written in. Some extensions, like .php, are straightforward, but others are more
cryptic and vary based on the frameworks in use. For example, a Java-based web application
might use .jsp, .do, or .html.
File extensions on web pages are becoming less common, however, since many languages and
frameworks now support the concept of routes,348 which allow developers to map a URI to a
section of code. Applications leveraging routes use logic to determine what content is returned to
the user, making URI extensions largely irrelevant.

348 (Wikipedia, 2022), https://en.wikipedia.org/wiki/Web_framework#URL_mapping
Although URL inspection can provide some clues about the target web application, most context
clues can be found in the source of the web page. The Firefox Debugger tool (found in the Web
Developer menu) displays the page’s resources and content, which varies by application. The
Debugger tool may display JavaScript frameworks, hidden input fields, comments, client-side
controls within HTML, JavaScript, and much more.

Let’s test this out by opening Debugger while browsing the offsecwp app:
Figure 101: Using Developer Tools to Inspect JavaScript Sources
We’ll notice that the application uses jQuery349 version 3.6.0, a common JavaScript library. In this
case, the developer minified350 the code, making it more compact and conserving resources,
which also makes it somewhat difficult to read. Fortunately, we can “prettify” code within Firefox
by clicking on the Pretty print source button with the double curly braces:

349 (jQuery, 2022), https://jquery.com/
350 (Wikipedia, 2022), https://en.wikipedia.org/wiki/Minification_(programming)
Figure 102: Pretty Print Source
After clicking the icon, Firefox will display the code in a format that is easier to read and follow:
Figure 103: Viewing Prettified Source in Firefox
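If we prefer to work outside the browser, a stand-alone beautifier can produce a similarly readable copy from the command line. This is a minimal sketch, assuming the jsbeautifier Python package (which provides the js-beautify command) is installed and that the minified script has already been saved locally as minified.js, a hypothetical filename:

kali@kali:~$ js-beautify minified.js > readable.js  # write a formatted copy for easier review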
We can also use the Inspector tool to drill down into specific page content. Let’s use Inspector to
examine the search input element from the WordPress home page by scrolling, right-clicking the
search field on the page, and selecting Inspect.
Figure 104: Selecting E-mail Input Element
This will open the Inspector tool and highlight the HTML for the element we right-clicked on.
Figure 105: Using the Inspector Tool
This tool can be especially useful for quickly finding hidden form fields in the HTML source.
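When a page is large, it can also be convenient to fetch the raw HTML with curl and filter it for hidden inputs from the command line. This is only a rough sketch that complements the Inspector view; the URL is a placeholder for whichever target we are assessing:

kali@kali:~$ curl -s http://target.example/ | grep -io '<input[^>]*type="hidden"[^>]*>'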
Now that we have some familiarity with how to use the built-in browser debugger, we’ll learn how
to use the Network Tool and Burp Proxy to inspect HTTP response headers.
8.3.2 Inspecting HTTP Response Headers and Sitemaps
We can also search server responses for additional information. There are two types of tools we
can use to accomplish this task. The first type is a proxy, like Burp Suite, which intercepts
requests and responses between a client and a web server, and the other is the browser’s own
Network tool.
We will explore both approaches in this Module, but let’s begin by demonstrating the Network
tool. We can launch it from the Firefox Web Developer menu to review HTTP requests and
responses. This tool shows network activity that occurs after it launches, so we must refresh the
page to display traffic.
Figure 106: Using the Network Tool to View Requests
We can click on a request to get more details about it. In this case, we want to inspect response
headers. Response headers are a subset of HTTP headers351 that are sent in response to an
HTTP request.

351 (Mozilla, 2022), https://developer.mozilla.org/en-US/docs/Web/HTTP/Headers
Figure 107: Viewing Response Headers in the Network Tool
The Server header displayed above will often reveal at least the name of the web server software.
In many default configurations, it also reveals the version number.
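The same headers can be retrieved without a browser by asking curl to perform a HEAD request, which returns only the response headers. The address below is a placeholder for our target:

kali@kali:~$ curl -I http://target.example/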
HTTP headers are not always generated solely by the web server. For instance, web proxies
actively insert the X-Forwarded-For352 header to signal the web server about the original client
IP address.

352 (Mozilla, 2022), https://developer.mozilla.org/en-US/docs/Web/HTTP/Headers/X-Forwarded-For
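To illustrate the format of such a header, we can attach one to our own request with curl’s -H option. The IP address below is an arbitrary documentation value, and how (or whether) the server acts on the header depends entirely on the application:

kali@kali:~$ curl -H "X-Forwarded-For: 203.0.113.10" http://target.example/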
Historically, headers that started with “X-” were called non-standard HTTP headers. However,
RFC6648353 now deprecates the use of “X-” in favor of a clearer naming convention.

353 (IETF, 2012), https://www.rfc-editor.org/rfc/rfc6648
The names or values in the response header often reveal additional information about the
technology stack used by the application. Some examples of non-standard headers include
X-Powered-By, x-amz-cf-id, and X-Aspnet-Version. Further research into these names could reveal
additional information, such as that the “x-amz-cf-id” header indicates the application uses
Amazon CloudFront.354
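A quick way to collect candidate headers for this kind of research is to filter a HEAD request’s output for names beginning with "x-". A minimal sketch against a placeholder target:

kali@kali:~$ curl -s -I http://target.example/ | grep -i '^x-'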
Sitemaps are another important element we should take into consideration when enumerating
web applications.

Web applications can include sitemap files to help search engine bots crawl and index their sites.
These files can also include directives specifying which URLs not to crawl - typically sensitive
pages or administrative consoles, which are exactly the sort of pages we are interested in.
Inclusive directives are handled by the sitemaps355 protocol, while the robots.txt file excludes
URLs from being crawled.
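Sitemap files are commonly published at predictable locations such as /sitemap.xml, or advertised via a Sitemap: line inside robots.txt, so both are worth requesting directly during enumeration. The path below is the conventional default rather than one confirmed for any particular target:

kali@kali:~$ curl https://target.example/sitemap.xml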
For example, we can retrieve the robots.txt file from www.google.com with curl:
kali@kali:~$ curl https://www.google.com/robots.txt