For a long time now, I have been wondering how to protect images online, especially as the world comes online with social networks offering extensive sharing and multimedia editing options.
We are all aware of image theft and morphing. There are a few techniques I could think of, like Divs + CSS, copyrights, watermarking and robots.txt, and a few more, thanks to Google :)
- Watermarking
A visible mark overlaid on the image indicating the ownership. There are tools like Photoshop, and a combination of HTML and CSS can also do the magic.
- Using Low Res Images
For an image to be printed or tampered with, it should have color profiles defined and should have high resolution. Several websites adopt the low resolution technique for basic viewing. Eg. Flickr re-sizes images to as low as 500 pixels (in the longest dimension) for basic viewing.
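The resize rule described above can be sketched as a small function. This is a minimal sketch; the 500-pixel figure comes from the Flickr example, and the function name is illustrative:

```javascript
// Scale (width, height) so the longest dimension becomes maxDim,
// preserving aspect ratio -- the "500 px longest side" rule above.
function fitLongestDimension(width, height, maxDim = 500) {
  const longest = Math.max(width, height);
  if (longest <= maxDim) return { width, height }; // already small enough
  const scale = maxDim / longest;
  return {
    width: Math.round(width * scale),
    height: Math.round(height * scale),
  };
}

// Example: a 4000x3000 original served for basic viewing
console.log(fitLongestDimension(4000, 3000)); // { width: 500, height: 375 }
```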
- Preventing Image Download: There are a few ways to do this.
1. Disabling right-click: This can annoy users and is not effective, as screen capture will defeat the purpose of image protection.
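Disabling right-click is usually done by suppressing the browser's context menu. A minimal sketch, with the handler as a named function so it can be shown in isolation (the element id is illustrative); as noted above, this is cosmetic only:

```javascript
// Block the browser's right-click menu over a protected element.
// Purely cosmetic protection: screenshots and dev tools bypass it.
function blockContextMenu(event) {
  event.preventDefault(); // suppress the "Save image as..." menu
  return false;
}

// In a browser, attach it to the image container (id assumed for illustration):
// document.getElementById("OuterDiv")
//   .addEventListener("contextmenu", blockContextMenu);
```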
2. <Div> + CSS: This technique can be adopted for a start, but used in combination with the Low Res technique and a watermark or copyright attached, it can offer better protection. This can be done in more than one way.
a. Single Div with transparent image and CSS
Styles are added inline for reference purposes only; they should be moved to a separate CSS file.
<html>
<style>
#OuterDiv
{
background-image: url('tulips.jpg');
height: 50%;
-webkit-user-select: none;
-khtml-user-select: none;
-moz-user-select: none;
-o-user-select: none;
user-select: none;
}
</style>
<div id="OuterDiv">
<img src="transparent.png"/>
</div>
</html>
b. Single Div with CSS
This will not let the user download the image. Styles are added inline for reference purposes only; they should be put into a separate CSS file.
<html>
<style>
#OuterDiv
{
background-image: url('tulips.jpg');
height: 50%;
-webkit-user-select: none;
-khtml-user-select: none;
-moz-user-select: none;
-o-user-select: none;
user-select: none;
}
</style>
<div id="OuterDiv">
</div>
</html>
c. Two Divs with CSS
This technique dims out the underlying image on hover by adding an extra div to add contrast. If the user tries to make a screen capture while hovering over the image, the image will be dimmed out, so the user does not get a clear capture. The drawback of this technique is that the image is dimmed only when the user hovers the mouse pointer over it. The work-around would be to use JavaScript to capture the keycodes used to make a screen capture and produce a dim image. Eg: in Windows, the Print Screen key or Windows + S, and so on. I cannot add it all to the following sample as it would make the sample code too large.
<html>
<style>
#OuterDiv
{
background-image: url('tulips.jpg');
height: 50%;
-webkit-user-select: none;
-khtml-user-select: none;
-moz-user-select: none;
-o-user-select: none;
user-select: none;
}
#InnerDiv
{
background-color: transparent;
height: 100%;
-webkit-user-select: none;
-khtml-user-select: none;
-moz-user-select: none;
-o-user-select: none;
user-select: none;
position: relative;
}
#InnerDiv:hover, #OuterDiv:hover
{
background-color: rgba(65, 65, 61, 0.91);
}
</style>
<div id="OuterDiv">
<div id="InnerDiv"></div>
</div>
</html>
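The JavaScript work-around mentioned above (detecting screen-capture keys and dimming the image) can be sketched as follows. This is best-effort only: browsers expose the Print Screen key inconsistently (often only on keyup, and it cannot block the OS capture itself), and the element id and class name are illustrative:

```javascript
// Dim the protected container when a screen-capture key is detected.
// Best-effort: the OS screenshot may already be taken by the time keyup fires.
function dimOnCaptureKey(event, element) {
  if (event.key === "PrintScreen") {
    element.classList.add("dimmed"); // a CSS class that darkens the image
    return true;
  }
  return false;
}

// Browser wiring (id and listener target are illustrative):
// const outer = document.getElementById("OuterDiv");
// document.addEventListener("keyup", e => dimOnCaptureKey(e, outer));
```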
d. Using <Table> tag and CSS
<html>
<style>
#OuterDiv
{
background-image: url('tulips.jpg');
height: 50%;
-webkit-user-select: none;
-khtml-user-select: none;
-moz-user-select: none;
-o-user-select: none;
user-select: none;
}
</style>
<table id="OuterDiv">
<tr>
<td>Some text here. Just to view the table with its background image</td>
</tr>
</table>
</html>
I'm sure many more combinations are possible. The CSS image URL can be encoded for additional obscurity. I have not tried this, but would like to give it a go.
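One common way to "encode" the CSS image URL is to inline the image as a base64 data URI, so the stylesheet exposes no direct file URL. A sketch assuming a Node.js build step (`Buffer` is Node-specific); note this only obscures the image, the bytes remain fully recoverable from the page:

```javascript
// Build a CSS background-image declaration with the image inlined
// as a base64 data URI, so no direct file URL appears in the stylesheet.
function inlineBackgroundImage(imageBytes, mimeType = "image/jpeg") {
  const base64 = Buffer.from(imageBytes).toString("base64");
  return `background-image: url('data:${mimeType};base64,${base64}');`;
}

// Usage (bytes would normally come from fs.readFileSync("tulips.jpg")):
const fakeBytes = [0xff, 0xd8, 0xff]; // JPEG magic bytes, for illustration
console.log(inlineBackgroundImage(fakeBytes));
// background-image: url('data:image/jpeg;base64,/9j/');
```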
- Prevent search engines from indexing images on your website
Meta tags and robots.txt can stop well-behaved crawlers from crawling images on your site. We made use of robots.txt in our BE project while implementing a web crawler and a mini search engine. A full understanding of robots.txt is not in the current scope, as it is a separate topic in itself. In short, robots.txt tells a crawler which pages and what content of your site can / should not be indexed.
1. Using meta-tags
<meta name="robots" content="noimageindex">
2. Using robots.txt
Add the following to robots.txt:
User-agent: *
Disallow: /images/
Disallow: /tulips.jpg
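How a well-behaved crawler interprets those Disallow lines comes down to path-prefix matching. A simplified sketch (real parsers also handle per-agent groups, Allow rules and wildcards):

```javascript
// Check a URL path against robots.txt Disallow rules.
// Simplified: ignores per-agent groups, Allow rules and wildcards.
function isDisallowed(path, disallowRules) {
  return disallowRules.some(rule => rule !== "" && path.startsWith(rule));
}

const rules = ["/images/", "/tulips.jpg"]; // from the robots.txt above
console.log(isDisallowed("/images/tulips.jpg", rules)); // true
console.log(isDisallowed("/about.html", rules)); // false
```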
- Copyrights
Of course, image theft still occurs, and then you need to take curative measures. I find WHOIS particularly useful; there may be other resources as well to detect copyright infringements.
The most important fact is that the above techniques (and there could be many more) should always be used in combination. Using a single technique will do little to protect you from image theft.