Commit 15460c93a0db195e2b4003d304456d76493469d7

Few meta changes
ISSUES.md
(0 / 11)
  
Current Issues with Mouchak
---------------------------

- When plugins are used to manipulate existing images, event listeners might not work properly.
This is because there is no wrapper element around the image. Plugins that listen on the parent
element of the image end up listening on the page itself (the page being the direct parent of
the images), so clicking or hovering anywhere on the page triggers the events, rather than only
on the images.

- Do elements like images or tables need wrapper elements?

- Find a better way to load/eval JS and CSS files?
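The first issue above is essentially an event-delegation problem: without a wrapper, the image's parent is the page container, so a plugin that binds to the parent handles clicks anywhere on the page. A minimal sketch (no real DOM or Mouchak code — the plain-object nodes and the `dispatchClick` helper are hypothetical stand-ins that simulate event bubbling) illustrates the difference a wrapper makes:

```javascript
// Walk up from the clicked node, firing any listener bound to an ancestor,
// the way a bubbling DOM event would.
function dispatchClick(target, listeners) {
  const fired = [];
  for (let node = target; node; node = node.parent) {
    if (listeners.has(node)) fired.push(node.name);
  }
  return fired;
}

// Case 1: no wrapper -- the image's parent is the page itself, so a plugin
// that binds to the image's parent has bound to the whole page.
const page = { name: 'page', parent: null };
const img = { name: 'img', parent: page };
let listeners = new Map([[img.parent, 'pluginHandler']]);
console.log(dispatchClick(page, listeners)); // clicking the bare page fires the handler

// Case 2: with a wrapper -- the same parent-binding stays scoped to the image.
const wrapper = { name: 'wrapper', parent: page };
const wrappedImg = { name: 'img', parent: wrapper };
listeners = new Map([[wrappedImg.parent, 'pluginHandler']]);
console.log(dispatchClick(page, listeners));       // clicking the bare page fires nothing
console.log(dispatchClick(wrappedImg, listeners)); // clicking the image still fires the handler
```

Wrapping each image in a dedicated element at render time would give parent-binding plugins a scoped target without changing the plugins themselves.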
  
crossdomain.xml
<?xml version="1.0"?>
<!DOCTYPE cross-domain-policy SYSTEM "http://www.adobe.com/xml/dtds/cross-domain-policy.dtd">
<cross-domain-policy>
  <!-- Read this: www.adobe.com/devnet/articles/crossdomain_policy_file_spec.html -->

  <!-- Most restrictive policy: -->
  <site-control permitted-cross-domain-policies="none"/>

  <!-- Least restrictive policy: -->
  <!--
  <site-control permitted-cross-domain-policies="all"/>
  <allow-access-from domain="*" to-ports="*" secure="false"/>
  <allow-http-request-headers-from domain="*" headers="*" secure="false"/>
  -->
</cross-domain-policy>
humans.txt
(1 / 1)
  
jQuery, Modernizr
Underscore.js, Backbone.js
Bootstrap

Python, Flask, PyMongo

MongoDB
  
  
robots.txt
(0 / 3)
  
# robotstxt.org/

User-agent: *